Author | Message | Time |
---|---|---|
Imperceptus | Say I have an array, and every now and then the last subscript of the array is empty. If I wanted to remove the last subscript and keep the rest of the values in the array unchanged, would this work? Or is there a better way? [code] If MyArray(UBound(MyArray)) = "" Then ReDim Preserve MyArray(UBound(MyArray) - 1) [/code] Edit: used the word "subscript" instead of "value". | February 14, 2004, 12:19 AM |
TheMinistered | If you are ever comparing or assigning a string in Visual Basic to empty, I recommend using vbNullString: [code] If MyArray(UBound(MyArray)) = vbNullString Then ReDim Preserve MyArray(UBound(MyArray) - 1) [/code] | February 14, 2004, 12:24 AM |
Stealth | [quote author=Imperceptus link=board=31;threadid=5280;start=0#msg44121 date=1076717965] For say I have an Array and Everynow and then the last subscript of the array is empty. If I wanted to remove remove the last subscript of the array and keep the rest of the values in the array unchanged would this work? or is there a better way? [code] If MyArray(Ubound(Myarray)) = "" then Redim Preserve MyArray(Ubound(MyArray)-1) [/code] Edit, Used word Subsript instead of value. [/quote] 'Member' is more technically correct. :) Yes -- that should work. And take TheMinistered's suggestion: "" is still a string in memory, just an empty one, whereas vbNullString is an actual null pointer, so no string gets allocated at all. | February 14, 2004, 12:43 AM |
Imperceptus | Thanks to the Both of you, most appreciated. | February 14, 2004, 12:44 AM |
o.OV | If your array size changes often, I recommend you don't ReDim for every item added or removed. Instead, start it off with, say, 10 elements of extra space, and grow or shrink it in chunks as needed without overdoing it. This will help your speed if you are working with a constantly changing array. | February 14, 2004, 3:14 AM |
Skywing | [quote author=o.OV link=board=31;threadid=5280;start=0#msg44170 date=1076728499] If your array size changes often.. I recommend you don't redim for every item added/removed.. but rather.. start it off with .. lets say 10 Extra space and increment/decrement as needed without overdoing it. This will help your speed if you are working with a constantly changing array. [/quote] In the general case it's most efficient to double the size. | February 14, 2004, 5:17 PM |
Adron | [quote author=Skywing link=board=31;threadid=5280;start=0#msg44267 date=1076779052] In the general case it's most efficient to double the size. [/quote] Why? | February 14, 2004, 5:33 PM |
o.OV | [quote author=Skywing link=board=31;threadid=5280;start=0#msg44267 date=1076779052] [quote author=o.OV link=board=31;threadid=5280;start=0#msg44170 date=1076728499] If your array size changes often.. I recommend you don't redim for every item added/removed.. but rather.. start it off with .. lets say 10 Extra space and increment/decrement as needed without overdoing it. This will help your speed if you are working with a constantly changing array. [/quote] In the general case it's most efficient to double the size. [/quote] But he sounded memory-conscious and seemed to want to conserve memory. I myself prefer to use memory generously, but double the size? :o I guess it depends on the operation. | February 14, 2004, 7:51 PM |
Skywing | Yes. It can be proven that doubling the size is the most efficient way to do it. | February 14, 2004, 8:07 PM |
Adron | [quote author=Skywing link=board=31;threadid=5280;start=0#msg44317 date=1076789261] Yes. It can be proven that doubling the size is the most efficient way to do it. [/quote] That proof is what I'm after. It seems non-obvious to me why that would be the case, so I'd like to learn. Of course, "general case" might imply something. Maybe it implies that the array will have to be copied? Maybe it implies that I don't know the array will never have to grow more than 10% larger than its current size? It'd be nice to know what kind of assumptions you're making. | February 14, 2004, 9:20 PM |
Skywing | At least, that was advertised as one of the things you'd learn in the algorithms course I'm taking. However, I haven't reached that point yet, so I can't give you an answer just yet. | February 15, 2004, 6:05 PM |
Grok | None of my books (at least the parts I've read) suggests that. Someone who has Knuth's algorithms books might look for array-growth optimizations and array-related algorithms. | February 15, 2004, 10:19 PM |
Adron | [quote author=Skywing link=board=31;threadid=5280;start=0#msg44439 date=1076868314] At least, that was advertised as one of the things you'd learn in the algorithms course I'm taking. However, I'm not to that point yet, so I can't give you an answer just yet. [/quote] If low memory usage is a higher priority than copying speed, I doubt that. And especially consider the possibility of resizing the array in place: if you're allocating at the page level, you can just commit more pages, or remap existing pages to another area of your virtual address space without having to copy the actual data. | February 16, 2004, 1:59 AM |
Skywing | No, I'd expect it's optimized for reducing the number of memcpys rather than reducing memory usage. I think this algorithm is designed to be applicable to higher-level situations where you won't necessarily be controlling where memory is allocated. | February 16, 2004, 6:43 PM |
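
The amortized argument behind the doubling claim debated above can be sketched in any language. The toy Python below (an illustration, not from the thread, and not VB) counts how many element copies each growth policy performs when appending n items to an array that must be reallocated and copied whenever it runs out of capacity: growing by a fixed chunk of 10 versus doubling.

```python
def copies_when_appending(n, grow):
    """Total elements copied while appending n items one at a time.
    `grow` maps the old capacity to the new capacity on reallocation."""
    capacity, size, copied = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            copied += size            # reallocate: copy all existing elements
            capacity = grow(capacity)
        size += 1
    return copied

n = 10_000
fixed_chunk = copies_when_appending(n, lambda c: c + 10)  # grow by 10 each time
doubling    = copies_when_appending(n, lambda c: c * 2)   # double each time

# Fixed-chunk growth copies on the order of n^2 / chunk elements in total,
# while doubling copies fewer than 2n elements: amortized O(1) per append.
```

The trade-off mentioned in the thread is real, though: doubling can leave up to half the allocated capacity unused, which is why some real allocators and container implementations use a smaller growth factor such as 1.5 to balance copy cost against wasted memory.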