(03-26-2024, 06:28 PM)TerryRitchie Wrote: I saw the use of REDIM twice as well and thought, huh, what's going on here. It's much more efficient than many of the ways I've used it in the past.
Normally I would use a counter, increase the array in size using that counter while preserving existing data, and then add the new data. That constant resizing and preserving in the loop took time.
Steve's method of creating a large array, reading the data in a loop, and then resizing afterward is much cleaner in my opinion and a technique I plan to implement from now on where appropriate.
If you guys would like to see an example of the speed difference between the two methods, here's one for you:
Code: (Select All)
ReDim foo(0) As Long 'the array that's going to hold an uncertain amount of data
Randomize Timer
limit = Rnd * 1000000 + 1000000 'now, nobody knows how large this limit is going to be. Right?
'Let's run some tests with various ways to fill an array with this uncertain amount of data

'Method 1: grow the array one element at a time, preserving its contents on every pass
t# = Timer(0.001)
count = 0
Do
    foo(count) = count
    If count < limit Then
        count = count + 1
        ReDim _Preserve foo(count)
    Else
        Exit Do
    End If
Loop
t1# = Timer(0.001)
Print Using "There were ###,###,### items in the array, and it took us ##.### seconds to fill it."; UBound(foo), t1# - t#

'Now, let's clear all that data and try a different method
'Method 2: oversize the array once, fill it, then trim it down afterward
ReDim foo(10000000) As Long 'big enough to hold the data, no matter what (ReDim without _Preserve also wipes the old contents)
t# = Timer(0.001)
count = 0 'reset the counter
Do
    foo(count) = count
    If count < limit Then
        count = count + 1
    Else
        Exit Do
    End If
Loop
ReDim _Preserve foo(count) 'trim the array down to just the data we actually stored
t1# = Timer(0.001)
Print Using "There were ###,###,### items in the array, and it took us ##.### seconds to fill it."; UBound(foo), t1# - t#
Run it a few times, as it generates a random-sized array, and you can easily see the difference in performance here -- and this is with just a very simple LONG data structure. Make this a UDT and have it move/resize records that contain dozens of pieces of information, and see how much more the performance varies. (A rough sketch of that is at the end of this post.)
For 1.5 million items here, I'm getting times of 1.803 seconds to resize and rebuild that LONG array one element at a time.
For the same 1.5 million items, I'm getting times of 0.024 seconds to load the data all at once and then resize it down after it's finished.
That's what, roughly 75 times faster?
Just as simple to code. Much faster and more efficient. Is there any reason why one wouldn't use this type of method to load data into resizable arrays?
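For the curious, here's a rough sketch of the UDT version of the same comparison. The ItemType record and its fields below are just made-up placeholders to give each element some bulk, and I've kept the limit smaller than above so the slow method still finishes in a reasonable time:

Code: (Select All)
Type ItemType 'hypothetical record layout, purely for illustration
    id As Long
    score As Double
    label As String * 32
End Type

ReDim rec(0) As ItemType
Randomize Timer
limit = Rnd * 200000 + 200000 'smaller random limit than the LONG example above

'Method 1: grow the UDT array one element at a time
t# = Timer(0.001)
count = 0
Do
    rec(count).id = count
    If count < limit Then
        count = count + 1
        ReDim _Preserve rec(count) As ItemType
    Else
        Exit Do
    End If
Loop
Print Using "Grow-as-you-go: ###,###,### records in ###.### seconds"; UBound(rec), Timer(0.001) - t#

'Method 2: oversize the array once, fill it, then trim it afterward
ReDim rec(400001) As ItemType 'big enough for any limit generated above
t# = Timer(0.001)
count = 0
Do
    rec(count).id = count
    If count < limit Then count = count + 1 Else Exit Do
Loop
ReDim _Preserve rec(count) As ItemType
Print Using "Fill-then-trim: ###,###,### records in ###.### seconds"; UBound(rec), Timer(0.001) - t#

The bigger the record, the more data each _Preserve resize has to shuffle around, so the gap between the two methods only grows from here.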
(03-26-2024, 06:47 PM)Kernelpanic Wrote: Yes, I modified ReDim to try it out.
An array created with ReDim is automatically declared as a dynamic array. So far, so good. But when I comment out the Data statement, the Read line gives an error and I don't know why. Why is it "Out of DATA" now?
The part with the for loop is clear, but the top part is not clear to me.
I'm afraid I'll have to create a corresponding program to understand what's going on.
Code: (Select All)
Screen _NewImage(800, 600, 32)
ReDim num(1000) As Integer
Do
    Read num(count)
    Print num
    If num(count) <> -1 Then count = count + 1 Else Exit Do
Loop
ReDim _Preserve num(count) As Integer
Print "There were "; count;
Print "numbers counted and stored in our array (with end of data marker of -1)"
Print "They were:"
Print "(Index)", "(Value)"
For i = 0 To UBound(num)
    Print i, num(i)
Next
End
'Data 1,2,3,4,5,6,7,8,9,-1
It's "Out of DATA" because you remarked out the DATA statement, so READ has nothing left to read. See the last line in your code.
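For reference, here's the same snippet with that Data line restored (and with Print num changed to Print num(count), which I'm guessing is what was intended, so it prints each value as it's read):

Code: (Select All)
Screen _NewImage(800, 600, 32)
ReDim num(1000) As Integer
Do
    Read num(count) 'pull the next value from the Data statement below
    Print num(count)
    If num(count) <> -1 Then count = count + 1 Else Exit Do
Loop
ReDim _Preserve num(count) As Integer 'trim the array down to the data actually read
Print "There were "; count;
Print "numbers counted and stored in our array (with end of data marker of -1)"
Print "They were:"
Print "(Index)", "(Value)"
For i = 0 To UBound(num)
    Print i, num(i)
Next
End
Data 1,2,3,4,5,6,7,8,9,-1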