#1 — GB — microsoft.public.excel.programming
Subject: Memory Management

I haven't found much on memory management, but this is the situation I am in.

I have an nth-level array of arrays. That means I may have as little as an
array of arrays, or as much as an array of arrays of arrays of arrays, and so on.

My current usage of the application results in something like 60,000 array
cells being created. It takes no more than five minutes to create, read, and
store data in all of the necessary cells. However...

When the program finishes, I currently do nothing to free up the memory;
Excel (VBA) "takes care" of it. This cleanup is taking about 30 minutes to
destroy all of the data.

How can I free up the memory faster? My thoughts are these:
1. Go to the bottom of each array, set the data = Nothing, and then, if the
array has more than one item, ReDim the array to size 1, ultimately ending
up with a single array item. That should take very little time for VBA to
clean up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be allocated
and that my efforts to remove each item will not have freed the memory to
make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual Basic,
C, or C++.
#2 — Tom Ogilvy — microsoft.public.excel.programming

I assume the arrays are dynamic. If so:

    Erase toplevelarray

--
Regards,
Tom Ogilvy
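A minimal sketch of what Tom is suggesting (the names and sizes here are illustrative, not taken from GB's actual code): build a jagged array of Variant arrays, then release the whole structure with a single Erase on the top level.

```vba
' Illustrative sketch: a jagged array of Variant arrays,
' released in one statement with Erase.
Sub DemoErase()
    Dim TopLevelArray() As Variant
    Dim inner() As Variant
    Dim i As Long

    ReDim TopLevelArray(1 To 100)
    For i = 1 To 100
        ReDim inner(1 To 600)        ' 100 x 600 = 60,000 elements total
        TopLevelArray(i) = inner     ' each element holds its own array
    Next i

    ' Erase deallocates the top-level dynamic array; each nested
    ' Variant array is released as its containing element is destroyed.
    Erase TopLevelArray
End Sub
```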

#3 — GB — microsoft.public.excel.programming
Default Memory Management

Yes, each array is dynamic.

So doing Erase toplevelarray will free all memory associated with every
dynamic array below it?

Any idea whether this is faster than the cleanup performed by finishing the
program without calling the Erase statement?



#4 — GB — microsoft.public.excel.programming


I tried the Erase approach.

It looks to be no faster than letting Excel handle it itself. Based on a
15-second measurement, it removed 852 KB, which should equate to about 3.4 MB
a minute and take no more than 7 minutes to remove all the data. However,
that must have been a fast 15 seconds: I have been waiting at least five
minutes and it has not freed more than a quarter of the memory that was
consumed by adding all of the data.

The program actually runs surprisingly fast for the amount of information
stuffed in there, but clearing the memory is taking far longer than the
program's run time. Hmm. I might have to try some other test(s). :\

Help is still needed.

#5 — Charles Williams — microsoft.public.excel.programming

Well, you could change your approach and go for a sparsely stored array with
the indexes needed for your arrays of arrays etc.: 60,000 array elements is
pretty small, and the time taken to erase them should not even be noticeable.
You would just need to work out an indirect indexing system.

However, I am still surprised it takes so long. What syntax are you using for
your arrays of arrays of arrays etc. (both declaration and indexing)? Are you
sure it's actually doing what you think it's doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com
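One way to read Charles's suggestion (this is an illustrative sketch, not his code, and the level sizes are made up): keep everything in a single flat array and compute one position from the multi-level indexes, so there is only one contiguous allocation for VBA to release.

```vba
' Illustrative sketch of indirect indexing into one flat array.
Sub DemoFlatIndex()
    Const LEVEL1 As Long = 100
    Const LEVEL2 As Long = 600
    Dim flat() As Variant
    ReDim flat(0 To LEVEL1 * LEVEL2 - 1)   ' 60,000 elements, one block

    Dim i As Long, j As Long
    i = 42: j = 17
    flat(i * LEVEL2 + j) = "value for (42, 17)"   ' computed index

    Erase flat   ' a single contiguous block frees almost instantly
End Sub
```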


#6 — GB — microsoft.public.excel.programming

Well, the concept I went with was to use a dynamic array as a complete binary
tree. My current usage of the program is on unique data located in three
separate columns; I was trying to find a way to store and manage nearly any
number of unique data items to make a unique key.

FYI, in case you are not familiar with the term, a complete binary tree first
of all has a root, which in this case is array item 1. As each item is added,
it is appended to the end of the array and then sorted into the correct
location. From any node (e.g., the root), everything to the left is less than
that node and everything to the right is greater, except at initial
insertion. If I were doing deletions, I would also take the last array item,
move it to the point of deletion, and then re-sort based on that value.

As for array memory management of my binary tree: I start off with one node,
and then each time I reach a new level, such that I would exceed the current
number of allocated cells, I increase the array to 2*size + 1. So I start
with 1 item, then have 3, then 7, then 15, etc. This way I do not have to
grow the array on every add, only when I have filled a row.

For each line of data, I add the first unique key to the first array, then
the second unique key to that array's array, then the third, and so on. The
last piece of information stored is the row number, and nothing is added to
the subtree of the row-number node.

In my case, this produces 4 nodes for the insertion of one item (probably
actually 5, but only 4 have data; I'm not in front of my code at the moment).

As for references: the left child is found by taking the current node's array
location * 2, the right child is current node * 2 + 1, and the parent is
Int(current node / 2). My data structure is such that I have a class
BinaryTree that holds an array of Nodes, and the Node class holds an array of
BinaryTrees, so I have a binary tree of binary trees. This seems much quicker
than my original method of just having a single array and comparing each item
to each item of another array. Except I'm still having problems with the
memory thing.
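The navigation arithmetic described above, written out as functions (the function names are mine, not from GB's code):

```vba
' Array-backed binary tree navigation, with the root at index 1.
Function LeftChild(ByVal node As Long) As Long
    LeftChild = node * 2
End Function

Function RightChild(ByVal node As Long) As Long
    RightChild = node * 2 + 1
End Function

Function Parent(ByVal node As Long) As Long
    Parent = node \ 2   ' integer division, equivalent to Int(node / 2)
End Function
```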

I have traced the program, frequently inspected the stored data, and followed
the memory paths to check whether I am getting some ad nauseam chain of
"trees", but I have not seen anything suspicious. I have noticed that when I
run this with an initial list of 36,000 rows, the amount of memory used by
Excel increases by 20 MB. As I said, it takes ~20 minutes to delete all of
that.


#7 — GB — microsoft.public.excel.programming

I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets (either the same
worksheet against itself, or two different worksheets). The purpose of the
comparison is to identify duplicates of the same unique key and, once the
duplicates are resolved, to identify the differences for each unique key. So
if a unique key is in worksheet1 and also in worksheet2, identify the
differences in that row; obviously there will be no differences in the unique
key itself.

The actual data insertion and comparison is really quick now that I have used
the binary-tree approach, but as stated, "cleaning up" is really slow. This
morning I tried cycling through each record and deleting from the bottom of
the tree up, and by inspection of the call stack and the associated data the
trees appear to be set up properly. If the tree is large (this test was on a
tree of 15,000+ items), it is slow to delete, but if it is small (1,000+
items) the time is nearly instantaneous. Sitting and thinking about that
sentence makes me think that maybe I should split the larger tree into
smaller trees and delete those. Hmmmm....

All in all, I'm starting to think this would be better done in Access. At
least then I wouldn't have to worry about the memory deletion; I could just
delete a table. But I'm not very proficient at Access database
programming. :\
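An alternative worth considering for this duplicate-finding task (this is a different technique, not something suggested in the thread, and the row/column ranges are illustrative): concatenate the three columns into one delimited string key and let a Scripting.Dictionary do the duplicate detection. A Dictionary tears down far faster than a deep jagged array, and Set d = Nothing releases everything at once.

```vba
' Sketch: duplicate detection via a composite string key.
Sub DemoDictionaryCompare()
    Dim d As Object
    Dim key As String
    Dim r As Long

    Set d = CreateObject("Scripting.Dictionary")

    For r = 2 To 36000                     ' illustrative row range
        key = Cells(r, 1).Value & "|" & _
              Cells(r, 2).Value & "|" & _
              Cells(r, 3).Value            ' three-column composite key
        If d.Exists(key) Then
            ' Duplicate key: compare this row against row d(key) here.
        Else
            d.Add key, r                   ' remember the first row seen
        End If
    Next r

    Set d = Nothing                        ' releases all entries quickly
End Sub
```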


#8 — Naz — microsoft.public.excel.programming

Is it worth using a memory-management tool such as FreeRamXP, and setting it
to empty your RAM when a certain amount is left?

--

_______________________
Naz,
London


