ExcelBanter

ExcelBanter > Excel Programming > Memory Management
(https://www.excelbanter.com/excel-programming/332543-memory-management.html)

GB

Memory Management
 
I haven't found much on memory management, but this is the situation I am in.

I have an nth-level array of arrays. That means I may have as little as an
array of arrays, or as much as an array of arrays of arrays of arrays, and so on.

My current use of the application creates something like 60,000 array cells.
It takes no more than 5 minutes to create, read, and store data to all of the
necessary cells. However...

When the program finishes, I currently do nothing to free the memory; Excel
(VBA) "takes care" of it. That cleanup is taking about 30 minutes to destroy
all of the data.

How can I free the memory faster? My thoughts are these:
1. Go to the bottom of each array, set the data to Nothing, and then, if the
array has more than one item, ReDim the array to size 1. I would ultimately
end up with a single array item, which should take VBA very little time to
clear up.

2. Some method recommended by someone here.

My concern with the first method is that the memory will still be allocated,
so my effort to remove each item will not have freed anything and will not
make the final closure of the program any faster.

Any ideas? I do not have access to more robust tools like Visual Basic,
C, or C++.

Tom Ogilvy

Memory Management
 
Assuming the arrays are dynamic:

Erase toplevelarray

--
Regards,
Tom Ogilvy
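
To illustrate Tom's point, here is a minimal sketch (the structure and names
are invented for the example, not taken from GB's actual code) showing that
erasing a top-level Variant array also releases the dynamic arrays stored in
its elements:

    Sub EraseNestedDemo()
        ' A two-level "array of arrays" held in a Variant array.
        Dim topArr() As Variant
        Dim inner() As Long
        Dim i As Long, j As Long

        ReDim topArr(1 To 100)
        For i = 1 To 100
            ReDim inner(1 To 600)
            For j = 1 To 600
                inner(j) = i * j
            Next j
            topArr(i) = inner        ' each element stores its own dynamic array
        Next i

        ' Releasing the top-level array also releases every nested array it holds.
        Erase topArr
    End Sub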

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little as an
array of arrays, or as much as an array of arrays of arrays of arrays,

etc...

My current usage of the application results in something like 60,000 array
cells being created. It takes no more than 5 minutes to create, read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free up

the
memory. Excel (VBA) "takes care" of it. This process is taking about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are these:
1. Go to the bottom of each array and set the data = nothing, and then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be allocated
and that my efforts to remove each item will not have freed the memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual

Basic,
C, or C++.




Naz

Memory Management
 
Is it worth using a memory-management tool such as FreeRamXP and setting it to
empty your RAM when a certain amount is left?

--

_______________________
Naz,
London


"GB" wrote:

I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little as an
array of arrays, or as much as an array of arrays of arrays of arrays, etc...

My current usage of the application results in something like 60,000 array
cells being created. It takes no more than 5 minutes to create, read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free up the
memory. Excel (VBA) "takes care" of it. This process is taking about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are these:
1. Go to the bottom of each array and set the data = nothing, and then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be allocated
and that my efforts to remove each item will not have freed the memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual Basic,
C, or C++.


GB

Memory Management
 
Yes, each array is dynamic.

So will doing Erase toplevelarray free all of the memory associated with every
dynamic array below it?

Any idea whether this is faster than the cleanup performed by letting the
program finish without calling the Erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little as an
array of arrays, or as much as an array of arrays of arrays of arrays,

etc...

My current usage of the application results in something like 60,000 array
cells being created. It takes no more than 5 minutes to create, read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free up

the
memory. Excel (VBA) "takes care" of it. This process is taking about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are these:
1. Go to the bottom of each array and set the data = nothing, and then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be allocated
and that my efforts to remove each item will not have freed the memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual

Basic,
C, or C++.





GB

Memory Management
 

Tried the Erase approach.

It looks like it is no faster than letting Excel handle it itself. Based on a
15-second measurement, it removed 852 KB, which should equate to about 3.4 MB
a minute and take no more than 7 minutes to remove all the data. However, that
must have been a fast 15 seconds: I have been waiting at least five minutes
and it has not freed more than a quarter of the memory that was allocated when
all of the data was added.

The program actually runs surprisingly fast for the amount of information that
is stuffed in there, but clearing the memory is taking far longer than the
program's run time. Hmm, might have to try some other test(s). :\

Help is still needed...

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little as an
array of arrays, or as much as an array of arrays of arrays of arrays,

etc...

My current usage of the application results in something like 60,000 array
cells being created. It takes no more than 5 minutes to create, read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free up

the
memory. Excel (VBA) "takes care" of it. This process is taking about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are these:
1. Go to the bottom of each array and set the data = nothing, and then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be allocated
and that my efforts to remove each item will not have freed the memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual

Basic,
C, or C++.





Charles Williams

Memory Management
 
Well, you could change your approach and go for a sparsely stored array, with
the indexes needed for your arrays of arrays handled indirectly: 60,000 array
elements is pretty small, and the time taken to erase them should not even be
noticeable. You would just need to work out an indirect indexing system.

However, I am still surprised it takes so long. What syntax are you using for
your arrays of arrays of arrays, etc. (both declaration and indexing)? Are you
sure it's actually doing what you think it's doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com
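
One way to read the "indirect indexing" idea (an interpretation for
illustration only, not Charles's or GB's code): keep all values in a single
flat 1-D array and map each composite key to its slot with a Collection, so
at the end there is only one array to Erase and one Collection to discard.

    ' Module-level storage: one flat array of values plus a key -> slot lookup.
    Private values() As Variant
    Private keyIndex As Collection
    Private used As Long

    Sub InitStore()
        ReDim values(1 To 1024)
        Set keyIndex = New Collection
        used = 0
    End Sub

    Sub AddValue(ByVal compositeKey As String, ByVal v As Variant)
        used = used + 1
        ' Double the flat array only when it fills up.
        If used > UBound(values) Then ReDim Preserve values(1 To UBound(values) * 2)
        values(used) = v
        keyIndex.Add used, compositeKey      ' remember which slot holds this key's value
    End Sub

    Function GetValue(ByVal compositeKey As String) As Variant
        GetValue = values(keyIndex(compositeKey))
    End Function

A composite key could simply be the column values joined with a delimiter,
for example key1 & "|" & key2 & "|" & key3.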

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based on
a
15 sec determination, it removed 852kb, which should equate to 3.4 meg a
minute, and take no more than 7 minutes to remove all the data. However,
that must have been a fast 15 secs. I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like 60,000
array
cells being created. It takes no more than 5 minutes to create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free
up
the
memory. Excel (VBA) "takes care" of it. This process is taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are
these:
1. Go to the bottom of each array and set the data = nothing, and
then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual
Basic,
C, or C++.






GB

Memory Management
 
Well, the concept I went with was to use a dynamic array as a complete binary
tree. My current use of the program is on unique data located in three
separate columns; I was trying to find a way to store and manage nearly any
number of unique data fields to make a unique key.

FYI (not sure if you know what a complete binary tree is, so I'll define it
here): a complete binary tree has a root, which in this case is array item 1.
As each item is added, it is appended to the end of the array and then sorted
into the correct location. From any node (e.g., the root), everything to the
left is less than that node and everything to the right is greater, except at
the moment of initial insertion. If I were doing deletions, I would take the
last array item, move it to the point of deletion, and then re-sort based on
that value.

As for memory management of the tree's array: I start with one node, and each
time I reach a new level that would exceed the currently allocated cells, I
grow the array to 2*size + 1. So I start with 1 item, then 3, then 7, then 15,
and so on. That way I do not have to resize the array on every add, only when
I have filled a level.

For each line of data, I add the first unique key to the first array, then the
second unique key to the array of that array, then the third, and so on. The
last piece of information stored is the row number, and nothing is added to
the subtree of the row-number node.

In my case, this produces 4 nodes for the insertion of one item (probably
actually 5, but only 4 hold data; I'm not in front of my code at the moment).

As for references: using the array location, the left child is current node *
2, the right child is current node * 2 + 1, and the parent is Int(current node
/ 2). My data structure is a BinaryTree class that has an array of Nodes, and
the Node class has an array of BinaryTrees, so I have a binary tree of binary
trees. This seems much quicker than my original method of just having a single
array and comparing each item to each item of another array. Except I'm still
having problems with the memory thing.

I have traced the program, frequently inspected the stored data, and followed
the memory structure to see whether I am getting trees of absurd depth, but I
have not seen anything suspicious. I have noticed that when I run this with an
initial list of 36,000 rows, the amount of memory used by Excel increases by
20 MB. As I said, it takes ~20 minutes to delete all of that.
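
As a minimal sketch of the index arithmetic and level-by-level growth
described above (the helper names are illustrative, not GB's actual code):

    ' Array-backed tree layout: children and parent found by index arithmetic.
    Function LeftChild(ByVal node As Long) As Long
        LeftChild = node * 2
    End Function

    Function RightChild(ByVal node As Long) As Long
        RightChild = node * 2 + 1
    End Function

    Function ParentOf(ByVal node As Long) As Long
        ParentOf = node \ 2          ' integer division, same as Int(node / 2) here
    End Function

    Sub GrowIfFull(ByRef arr() As Variant, ByVal used As Long)
        ' Grow only when a level fills: capacity goes 1 -> 3 -> 7 -> 15 -> ...
        Dim cap As Long
        cap = UBound(arr)
        If used >= cap Then ReDim Preserve arr(1 To cap * 2 + 1)
    End Sub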


"Charles Williams" wrote:

Well you could change your approach and go for a sparsely stored array with
the indexes needed for your arrays of arrays etc: 60000 array elements is
pretty small and the time taken to erase them should not even be noticeable.
You would just need to work out an indirect indexing system.

However I am still surprised it takes so long. What syntax are you using for
your arrays of arrays of arrays etc (both declaration and indexing)? Are you
sure its actually doing what you think its doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based on
a
15 sec determination, it removed 852kb, which should equate to 3.4 meg a
minute, and take no more than 7 minutes to remove all the data. However,
that must have been a fast 15 secs. I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like 60,000
array
cells being created. It takes no more than 5 minutes to create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free
up
the
memory. Excel (VBA) "takes care" of it. This process is taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are
these:
1. Go to the bottom of each array and set the data = nothing, and
then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual
Basic,
C, or C++.







GB

Memory Management
 
I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets, which may be the
same worksheet or two different ones. The purpose of the comparison is to
identify duplicates of the same unique key and, once the duplicates are
resolved, to identify the differences for each unique key. So if a unique key
is in worksheet 1 and also in worksheet 2, identify the differences in the
associated row (obviously there will be no differences in the unique key
itself).

The actual data insertion and comparison are really quick now that I have used
the binary tree approach. However, as stated, "cleaning up" is really slow. I
tried this morning to cycle through each record and delete from the bottom of
the tree up, and from inspection of the call stack and the associated data the
trees appear to be set up properly. But if the tree is large (this test was on
a 15,000+ item tree) it is slow to delete, while if it is small (1,000+ items)
the time is nearly instantaneous. Sitting and thinking about that sentence
makes me think that maybe I should split the larger tree into smaller trees
and delete those instead. Hmmm...

All in all, I'm starting to think this would be better done in Access. At
least then I wouldn't have to worry about freeing memory; I'd just delete a
table. But I'm not very proficient at Access database programming. :\

"Charles Williams" wrote:

Well you could change your approach and go for a sparsely stored array with
the indexes needed for your arrays of arrays etc: 60000 array elements is
pretty small and the time taken to erase them should not even be noticeable.
You would just need to work out an indirect indexing system.

However I am still surprised it takes so long. What syntax are you using for
your arrays of arrays of arrays etc (both declaration and indexing)? Are you
sure its actually doing what you think its doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based on
a
15 sec determination, it removed 852kb, which should equate to 3.4 meg a
minute, and take no more than 7 minutes to remove all the data. However,
that must have been a fast 15 secs. I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is the
situation I am in.

I an nth level array of arrays. That means that I may have as little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like 60,000
array
cells being created. It takes no more than 5 minutes to create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free
up
the
memory. Excel (VBA) "takes care" of it. This process is taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are
these:
1. Go to the bottom of each array and set the data = nothing, and
then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual
Basic,
C, or C++.







Charles Williams

Memory Management
 
Why not go with a simpler approach?

Build a dynamic output array of Variants dimensioned (4, n), where the 4 slots
hold the unique key, the column number, the value from worksheet 1, and the
value from worksheet 2, and n is the nth difference identified.

The process goes something like this:
- get the unique keys into two arrays and sort them if necessary
- loop over the arrays looking for matches
- when a match is found, get the row of data from each sheet into two arrays
of Variants
- compare the two arrays of Variants and, whenever there is a difference, add
a record to the output array, resizing as required

It shouldn't take more than a few seconds to process...

Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com
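
A minimal sketch of the approach Charles outlines, under assumptions not
stated in the thread: the keys sit in column 1 of each sheet's used range, row
1 holds headers, and both sheets share the same column layout. For simplicity
it uses a plain nested scan to match keys; sorting the key lists first, as
Charles suggests, avoids the quadratic lookup.

    Sub CompareSheets()
        Dim v1 As Variant, v2 As Variant
        Dim outArr() As Variant
        Dim nDiff As Long, i As Long, j As Long, c As Long

        ' Pull both sheets into memory in one shot.
        v1 = Worksheets("Sheet1").Range("A1").CurrentRegion.Value
        v2 = Worksheets("Sheet2").Range("A1").CurrentRegion.Value

        ReDim outArr(1 To 4, 1 To 1)

        For i = 2 To UBound(v1, 1)                ' row 1 assumed to be headers
            For j = 2 To UBound(v2, 1)
                If v1(i, 1) = v2(j, 1) Then       ' matching unique key in column 1
                    For c = 2 To UBound(v1, 2)    ' assumes both sheets have the same columns
                        If v1(i, c) <> v2(j, c) Then
                            nDiff = nDiff + 1
                            ReDim Preserve outArr(1 To 4, 1 To nDiff)
                            outArr(1, nDiff) = v1(i, 1)   ' unique key
                            outArr(2, nDiff) = c          ' column number
                            outArr(3, nDiff) = v1(i, c)   ' value from worksheet 1
                            outArr(4, nDiff) = v2(j, c)   ' value from worksheet 2
                        End If
                    Next c
                    Exit For
                End If
            Next j
        Next i

        ' outArr now holds one record per difference found.
    End Sub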

"GB" wrote in message
...
I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets. Either they are
the same worksheet, or two different worksheets. The purpose of the
comparison is to identify duplicates of the same unique key and once the
duplicates are resolved to identify differences for each unique key. So
if
Unique key is in worksheet1 and also in worksheet2 identify the
differences
associated with that row. Obviously will not find any differences in the
unique key.

The actual time of data insertion and comparison is really quick now that
I
have used the binary tree aspect. However as stated "cleaning" up is
really
slow. I tried this morning to cycle through each record and delete from
the
bottom of the tree up and it looks like the data trees are set up properly
by
inspection of the call stack and the associated data. But if the tree is
large, (this test was on a 15000+ item tree), it is slow to delete, but if
it
is small (1000+ items) the time is nearly instantaneous. Sitting and
thinking about that sentence, makes me think that maybe I should split the
larger tree into smaller trees and delete from that. Hmmmm....

All in all, I'm starting to think that this would be better performed in
Access. At least then I wouldn't have to really worry about the memory
deletion, just delete a table. But I'm not very proficient at Access
database programming. :\

"Charles Williams" wrote:

Well you could change your approach and go for a sparsely stored array
with
the indexes needed for your arrays of arrays etc: 60000 array elements is
pretty small and the time taken to erase them should not even be
noticeable.
You would just need to work out an indirect indexing system.

However I am still surprised it takes so long. What syntax are you using
for
your arrays of arrays of arrays etc (both declaration and indexing)? Are
you
sure its actually doing what you think its doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based
on
a
15 sec determination, it removed 852kb, which should equate to 3.4 meg
a
minute, and take no more than 7 minutes to remove all the data.
However,
that must have been a fast 15 secs. I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with
every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is
the
situation I am in.

I an nth level array of arrays. That means that I may have as
little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like
60,000
array
cells being created. It takes no more than 5 minutes to create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to
free
up
the
memory. Excel (VBA) "takes care" of it. This process is taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are
these:
1. Go to the bottom of each array and set the data = nothing, and
then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a
very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like
Visual
Basic,
C, or C++.









GB

Memory Management
 
Actually, that's what I started from. *smirk* But then the compare time was in
the 20-30 minute range, AND the data destruction took 10 minutes. I realized I
needed a faster search when given a particular key, so I designed the binary
tree approach; now the search and compare take no more than a few minutes, but
the data destruction is taking much longer... Looks like it's a matter of
choosing the lesser of two evils. :)

I've been contemplating doing the comparisons on an alternate spreadsheet and
then pulling the necessary results from there, but I thought that cell
referencing was slower than in-memory comparisons. Again, the comparisons are
quick; it's the destruction that has been slow and painful. I'm concerned that
my "user" will not be the least bit satisfied with this, though they may have
to suck it up and use it anyway if an alternate solution can't be worked out.



"Charles Williams" wrote:

Why not go with a simpler approach?

Build a dynamic output array of variants which is dimensioned (4,n) where
the 4 is Uniquekey,columnnumber,value from wksheet1, value from wksheets2,
and n is the nth difference identified

process goes something like this
- get the unique keys into two arrays and sort them if neccessary
- loop on the arrays looking for matches
- when a match is found get the row of data from each sheet into two arrays
of variants
- compare the two arrays of variants and whenever there is a difference add
a record to the output array, resizing as required.

shouldnt take more seconds to process ...

Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...
I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets. Either they are
the same worksheet, or two different worksheets. The purpose of the
comparison is to identify duplicates of the same unique key and once the
duplicates are resolved to identify differences for each unique key. So
if
Unique key is in worksheet1 and also in worksheet2 identify the
differences
associated with that row. Obviously will not find any differences in the
unique key.

The actual time of data insertion and comparison is really quick now that
I
have used the binary tree aspect. However as stated "cleaning" up is
really
slow. I tried this morning to cycle through each record and delete from
the
bottom of the tree up and it looks like the data trees are set up properly
by
inspection of the call stack and the associated data. But if the tree is
large, (this test was on a 15000+ item tree), it is slow to delete, but if
it
is small (1000+ items) the time is nearly instantaneous. Sitting and
thinking about that sentence, makes me think that maybe I should split the
larger tree into smaller trees and delete from that. Hmmmm....

All in all, I'm starting to think that this would be better performed in
Access. At least then I wouldn't have to really worry about the memory
deletion, just delete a table. But I'm not very proficient at Access
database programming. :\

"Charles Williams" wrote:

Well you could change your approach and go for a sparsely stored array
with
the indexes needed for your arrays of arrays etc: 60000 array elements is
pretty small and the time taken to erase them should not even be
noticeable.
You would just need to work out an indirect indexing system.

However I am still surprised it takes so long. What syntax are you using
for
your arrays of arrays of arrays etc (both declaration and indexing)? Are
you
sure its actually doing what you think its doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based
on
a
15 sec determination, it removed 852kb, which should equate to 3.4 meg
a
minute, and take no more than 7 minutes to remove all the data.
However,
that must have been a fast 15 secs. I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with
every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is
the
situation I am in.

I an nth level array of arrays. That means that I may have as
little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like
60,000
array
cells being created. It takes no more than 5 minutes to create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to
free
up
the
memory. Excel (VBA) "takes care" of it. This process is taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are
these:
1. Go to the bottom of each array and set the data = nothing, and
then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a
very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like
Visual
Basic,
C, or C++.










GB

Memory Management
 
Although, I was just thinking: I hadn't sorted the arrays. At the time I was
writing the code, I just needed to be able to perform the task, not
necessarily with the optimum approach. I could revisit the sorting algorithm
and see if I can get a faster comparison without using the trees, since it is
taking so long to destroy the data. Now I just have to determine the fastest
sorting algorithm that uses minimal storage for something on the order of
30,000 items. Ah, the eternal troubles of programming.


"Charles Williams" wrote:

Why not go with a simpler approach?

Build a dynamic output array of variants which is dimensioned (4,n) where
the 4 is Uniquekey,columnnumber,value from wksheet1, value from wksheets2,
and n is the nth difference identified

process goes something like this
- get the unique keys into two arrays and sort them if neccessary
- loop on the arrays looking for matches
- when a match is found get the row of data from each sheet into two arrays
of variants
- compare the two arrays of variants and whenever there is a difference add
a record to the output array, resizing as required.

shouldnt take more seconds to process ...

Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...
I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets. Either they are
the same worksheet, or two different worksheets. The purpose of the
comparison is to identify duplicates of the same unique key and once the
duplicates are resolved to identify differences for each unique key. So
if
Unique key is in worksheet1 and also in worksheet2 identify the
differences
associated with that row. Obviously will not find any differences in the
unique key.

The actual time of data insertion and comparison is really quick now that
I
have used the binary tree aspect. However as stated "cleaning" up is
really
slow. I tried this morning to cycle through each record and delete from
the
bottom of the tree up and it looks like the data trees are set up properly
by
inspection of the call stack and the associated data. But if the tree is
large, (this test was on a 15000+ item tree), it is slow to delete, but if
it
is small (1000+ items) the time is nearly instantaneous. Sitting and
thinking about that sentence, makes me think that maybe I should split the
larger tree into smaller trees and delete from that. Hmmmm....

All in all, I'm starting to think that this would be better performed in
Access. At least then I wouldn't have to really worry about the memory
deletion, just delete a table. But I'm not very proficient at Access
database programming. :\

"Charles Williams" wrote:

Well you could change your approach and go for a sparsely stored array
with
the indexes needed for your arrays of arrays etc: 60000 array elements is
pretty small and the time taken to erase them should not even be
noticeable.
You would just need to work out an indirect indexing system.

However I am still surprised it takes so long. What syntax are you using
for
your arrays of arrays of arrays etc (both declaration and indexing)? Are
you
sure its actually doing what you think its doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based
on
a
15 sec determination, it removed 852kb, which should equate to 3.4 meg
a
minute, and take no more than 7 minutes to remove all the data.
However,
that must have been a fast 15 secs. I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with
every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this is
the
situation I am in.

I an nth level array of arrays. That means that I may have as
little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like
60,000
array
cells being created. It takes no more than 5 minutes to create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to
free
up
the
memory. Excel (VBA) "takes care" of it. This process is taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are
these:
1. Go to the bottom of each array and set the data = nothing, and
then if
the array has more than one item, redim the array to be of size 1.

Ultimately ending up with a single array item. that will take a
very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like
Visual
Basic,
C, or C++.










Charles Williams

Memory Management
 
Just use quicksort.
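
For reference, a generic in-place quicksort for a 1-D Variant array of keys (a
textbook implementation, not code from this thread). Call it as
QuickSort keys, LBound(keys), UBound(keys).

    Sub QuickSort(arr() As Variant, ByVal lo As Long, ByVal hi As Long)
        Dim pivot As Variant, tmp As Variant
        Dim i As Long, j As Long
        i = lo
        j = hi
        pivot = arr((lo + hi) \ 2)
        Do While i <= j
            Do While arr(i) < pivot
                i = i + 1
            Loop
            Do While arr(j) > pivot
                j = j - 1
            Loop
            If i <= j Then
                ' Swap the out-of-place pair.
                tmp = arr(i)
                arr(i) = arr(j)
                arr(j) = tmp
                i = i + 1
                j = j - 1
            End If
        Loop
        ' Recurse into the two partitions.
        If lo < j Then QuickSort arr, lo, j
        If i < hi Then QuickSort arr, i, hi
    End Sub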

"GB" wrote in message
...
Although, I was just thinking, I hadn't sorted the arrays. At the time I
was
writing the code, I just needed to be able to perform the task, not
necessarily with the optimum approach. I could revisit the sorting
algorithm
and see if I can get a faster comparison without using the trees, since it
is
taking so long to destroy the data. Now just to determine the fastest
sorting algorithm that uses minimum data storage for something of size
30,000
items. Ahh the eternal troubles of programming.


"Charles Williams" wrote:

Why not go with a simpler approach?

Build a dynamic output array of variants which is dimensioned (4,n) where
the 4 is Uniquekey,columnnumber,value from wksheet1, value from
wksheets2,
and n is the nth difference identified

process goes something like this
- get the unique keys into two arrays and sort them if neccessary
- loop on the arrays looking for matches
- when a match is found get the row of data from each sheet into two
arrays
of variants
- compare the two arrays of variants and whenever there is a difference
add
a record to the output array, resizing as required.

shouldnt take more seconds to process ...

Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...
I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets. Either they
are
the same worksheet, or two different worksheets. The purpose of the
comparison is to identify duplicates of the same unique key and once
the
duplicates are resolved to identify differences for each unique key.
So
if
Unique key is in worksheet1 and also in worksheet2 identify the
differences
associated with that row. Obviously will not find any differences in
the
unique key.

The actual time of data insertion and comparison is really quick now
that
I
have used the binary tree aspect. However as stated "cleaning" up is
really
slow. I tried this morning to cycle through each record and delete
from
the
bottom of the tree up and it looks like the data trees are set up
properly
by
inspection of the call stack and the associated data. But if the tree
is
large, (this test was on a 15000+ item tree), it is slow to delete, but
if
it
is small (1000+ items) the time is nearly instantaneous. Sitting and
thinking about that sentence, makes me think that maybe I should split
the
larger tree into smaller trees and delete from that. Hmmmm....

All in all, I'm starting to think that this would be better performed
in
Access. At least then I wouldn't have to really worry about the memory
deletion, just delete a table. But I'm not very proficient at Access
database programming. :\

"Charles Williams" wrote:

Well you could change your approach and go for a sparsely stored array
with
the indexes needed for your arrays of arrays etc: 60000 array elements
is
pretty small and the time taken to erase them should not even be
noticeable.
You would just need to work out an indirect indexing system.

However I am still surprised it takes so long. What syntax are you
using
for
your arrays of arrays of arrays etc (both declaration and indexing)?
Are
you
sure its actually doing what you think its doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself.
Based
on
a
15 sec determination, it removed 852kb, which should equate to 3.4
meg
a
minute, and take no more than 7 minutes to remove all the data.
However,
that must have been a fast 15 secs. I have been waiting at least
five
minutes and it has not freed more than a quarter of the memory that
was
created by adding all of the data in.

Actually runs suprisingly fast for the amount of information that is
stuffed
in there, however clearing of the memory is taking way longer than
the
program run time. Hmm.. Might have to try some other test(s). :\

Help is still needed.....

"GB" wrote:

Yes each array is dynamic.

So doing erase toplevelarray, will free all memory associated with
every
dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing
the
program without calling the erase statement?



"Tom Ogilvy" wrote:

assume the arrays are dynamic.

erase toplevelarray

--
Regards,
Tom Ogilvy

"GB" wrote in message
...
I haven't found anything really on Memory management, but this
is
the
situation I am in.

I an nth level array of arrays. That means that I may have as
little
as an
array of arrays, or as much as an array of arrays of arrays of
arrays,
etc...

My current usage of the application results in something like
60,000
array
cells being created. It takes no more than 5 minutes to
create,
read, and
store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to
free
up
the
memory. Excel (VBA) "takes care" of it. This process is
taking
about 30
minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts
are
these:
1. Go to the bottom of each array and set the data = nothing,
and
then if
the array has more than one item, redim the array to be of size
1.

Ultimately ending up with a single array item. that will take a
very
short
amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still
be
allocated
and that my efforts to remove each item will not have freed the
memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like
Visual
Basic,
C, or C++.













All times are GMT +1. The time now is 05:03 AM.

Powered by vBulletin® Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
ExcelBanter.com