Posted to microsoft.public.excel.programming

Memory Management

just use quicksort
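For example, a minimal in-place QuickSort sketch for a 1-D Variant array of keys (the procedure name and bounds are illustrative). Nothing beyond the call stack is allocated, which suits the 30,000-item, minimum-storage requirement mentioned below:

Sub QuickSortKeys(a As Variant, ByVal lo As Long, ByVal hi As Long)
    ' Hoare-style partition around the middle element
    Dim p As Variant, tmp As Variant
    Dim i As Long, j As Long
    If lo >= hi Then Exit Sub
    p = a((lo + hi) \ 2)
    i = lo: j = hi
    Do While i <= j
        Do While a(i) < p: i = i + 1: Loop
        Do While a(j) > p: j = j - 1: Loop
        If i <= j Then
            tmp = a(i): a(i) = a(j): a(j) = tmp
            i = i + 1: j = j - 1
        End If
    Loop
    QuickSortKeys a, lo, j      ' sort the left partition
    QuickSortKeys a, i, hi      ' sort the right partition
End Sub

' usage: QuickSortKeys keys, LBound(keys), UBound(keys)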

"GB" wrote in message
...
Although, I was just thinking: I hadn't sorted the arrays. At the time I was writing the code, I just needed to be able to perform the task, not necessarily with the optimum approach. I could revisit the sorting algorithm and see if I can get a faster comparison without using the trees, since it is taking so long to destroy the data. Now just to determine the fastest sorting algorithm that uses minimum data storage for something of size 30,000 items. Ahh, the eternal troubles of programming.


"Charles Williams" wrote:

Why not go with a simpler approach?

Build a dynamic output array of variants dimensioned (4, n), where the 4 holds UniqueKey, column number, value from worksheet1, and value from worksheet2, and n is the nth difference identified.

The process goes something like this:
- get the unique keys into two arrays and sort them if necessary
- loop on the arrays looking for matches
- when a match is found, get the row of data from each sheet into two arrays of variants
- compare the two arrays of variants and, whenever there is a difference, add a record to the output array, resizing as required (see the sketch below)

Shouldn't take more than a few seconds to process ...

Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com
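For illustration, a hedged sketch of that approach. The sheet names, the key in column 1, the header row, identical column layouts, and pre-sorted keys are all assumptions for the example, not details from the thread:

Sub CompareSheets()
    Dim v1 As Variant, v2 As Variant
    v1 = Worksheets("Sheet1").UsedRange.Value    ' one bulk read per sheet
    v2 = Worksheets("Sheet2").UsedRange.Value

    Dim out() As Variant, n As Long
    ReDim out(1 To 4, 1 To 1024)                 ' UniqueKey, column, value1, value2

    Dim r1 As Long, r2 As Long, c As Long
    r1 = 2: r2 = 2                               ' skip the header row
    Do While r1 <= UBound(v1, 1) And r2 <= UBound(v2, 1)
        If v1(r1, 1) < v2(r2, 1) Then
            r1 = r1 + 1                          ' key only on sheet 1
        ElseIf v1(r1, 1) > v2(r2, 1) Then
            r2 = r2 + 1                          ' key only on sheet 2
        Else                                     ' matched key: compare the rows
            For c = 2 To UBound(v1, 2)
                If v1(r1, c) <> v2(r2, c) Then
                    n = n + 1
                    If n > UBound(out, 2) Then ReDim Preserve out(1 To 4, 1 To n * 2)
                    out(1, n) = v1(r1, 1)
                    out(2, n) = c
                    out(3, n) = v1(r1, c)
                    out(4, n) = v2(r2, c)
                End If
            Next c
            r1 = r1 + 1: r2 = r2 + 1
        End If
    Loop
    Debug.Print n & " differences found"
End Sub

Dimensioning the output as (4, n) rather than (n, 4) is what makes the resizing cheap: ReDim Preserve can only grow the last dimension of an array.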

"GB" wrote in message
...
I should probably also explain what I am trying to do with this program.

I essentially want to compare the data of two worksheets. Either they are the same worksheet, or two different worksheets. The purpose of the comparison is to identify duplicates of the same unique key and, once the duplicates are resolved, to identify the differences for each unique key. So if a unique key is in worksheet1 and also in worksheet2, identify the differences associated with that row. Obviously it will not find any differences in the unique key itself.

The actual time of data insertion and comparison is really quick now that I have used the binary tree approach. However, as stated, "cleaning up" is really slow. I tried this morning to cycle through each record and delete from the bottom of the tree up, and it looks like the data trees are set up properly, by inspection of the call stack and the associated data. But if the tree is large (this test was on a 15,000+ item tree), it is slow to delete, whereas if it is small (1,000+ items) the time is nearly instantaneous. Sitting and thinking about that sentence makes me think that maybe I should split the larger tree into smaller trees and delete those. Hmmmm....

All in all, I'm starting to think that this would be better performed in Access. At least then I wouldn't have to really worry about the memory deletion, just delete a table. But I'm not very proficient at Access database programming. :\
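On the slow tree teardown: assuming the tree is built from class-module instances wired together with Left/Right object references (the property names below are invented for the sketch; the actual class isn't shown in the thread), an explicit bottom-up release looks something like this:

Sub FreeTree(nd As Object)            ' nd: an instance of the tree-node class
    If nd Is Nothing Then Exit Sub
    FreeTree nd.Left                   ' release the left subtree first
    FreeTree nd.Right                  ' then the right subtree
    Set nd.Left = Nothing              ' break the links so the node can be reclaimed
    Set nd.Right = Nothing
End Sub

' usage: FreeTree Root: Set Root = Nothing

Since a plain binary tree has no circular references, VBA's reference counting should release it on its own when the root goes out of scope; if explicit deletion of a 15,000-node tree is still slow, parent back-pointers (which create cycles reference counting cannot free) would be worth checking.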

"Charles Williams" wrote:

Well, you could change your approach and go for a sparsely stored array with the indexes needed for your arrays of arrays etc.: 60,000 array elements is pretty small, and the time taken to erase them should not even be noticeable. You would just need to work out an indirect indexing system (a sketch follows below).

However, I am still surprised it takes so long. What syntax are you using for your arrays of arrays of arrays etc. (both declaration and indexing)? Are you sure it's actually doing what you think it's doing?


Charles
______________________
Decision Models
FastExcel 2.1 now available
www.DecisionModels.com
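A hedged sketch of one such indirect indexing scheme (the record/level shape and the sizes are illustrative): keep everything in a single 1-D Variant array and compute the slot from the logical coordinates, so teardown becomes one Erase of one contiguous block.

Const LEVELS As Long = 4                       ' assumed maximum nesting depth

Function FlatIndex(ByVal rec As Long, ByVal lvl As Long) As Long
    ' map a (record, level) pair onto a single 1-D index
    FlatIndex = (rec - 1) * LEVELS + lvl
End Function

' usage:
'   Dim store() As Variant
'   ReDim store(1 To 15000 * LEVELS)
'   store(FlatIndex(42, 3)) = "value"
'   ...
'   Erase store                                ' one deallocation, effectively instant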

"GB" wrote in message
...

Tried the Erase approach.

Looks like it is no faster than letting Excel handle it itself. Based on a 15-second determination, it removed 852 KB, which should equate to 3.4 MB a minute and take no more than 7 minutes to remove all the data. However, that must have been a fast 15 seconds: I have been waiting at least five minutes and it has not freed more than a quarter of the memory that was created by adding all of the data in.

It actually runs surprisingly fast for the amount of information that is stuffed in there; however, clearing the memory is taking way longer than the program run time. Hmm... might have to try some other test(s). :\

Help is still needed.....
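One way to get a cleaner measurement than watching memory usage is VBA's Timer function; a minimal sketch, assuming a module-level dynamic array named Data (the name is an assumption):

Sub TimeErase()
    Dim t As Double
    t = Timer
    Erase Data                   ' releases the array and any nested arrays it holds
    Debug.Print "Erase took " & Format(Timer - t, "0.00") & " seconds"
End Sub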

"GB" wrote:

Yes, each array is dynamic.

So doing Erase toplevelarray will free all memory associated with every dynamic array below it?

Any idea if this is faster than the cleanup performed by finishing the program without calling the Erase statement?



"Tom Ogilvy" wrote:

Assume the arrays are dynamic.

Erase toplevelarray

--
Regards,
Tom Ogilvy
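For what it's worth, a minimal illustration of what Erase does to a jagged (array-of-arrays) structure; the names are invented for the example:

Sub EraseJaggedDemo()
    Dim toplevelarray() As Variant
    ReDim toplevelarray(1 To 10)

    Dim i As Long, child() As Variant
    For i = 1 To 10
        ReDim child(1 To 100)
        toplevelarray(i) = child   ' each slot holds its own dynamic array
    Next i

    Erase toplevelarray            ' frees the top array and every nested array in one call
End Sub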

"GB" wrote in message
...
I haven't found anything really on memory management, but this is the situation I am in.

I have an nth-level array of arrays. That means that I may have as little as an array of arrays, or as much as an array of arrays of arrays of arrays, etc...

My current usage of the application results in something like 60,000 array cells being created. It takes no more than 5 minutes to create, read, and store data to all of the necessary cells, however....

When the program finishes, I currently am not doing anything to free up the memory. Excel (VBA) "takes care" of it. This process is taking about 30 minutes to destroy all of the data.

How can I free up the memory in a faster fashion? My thoughts are these:

1. Go to the bottom of each array and set the data = Nothing, and then, if the array has more than one item, ReDim the array to be of size 1. Ultimately I end up with a single array item, which will take a very short amount of time for VBA to clear up.

2. Some method recommended by someone here.

My concern with my first method is that the memory will still be allocated, and that my efforts to remove each item will not have freed the memory to make the final closure of the program any faster.

Any ideas? I do not have access to more robust programs like Visual Basic, C, or C++.
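A hedged sketch of option 1 for a jagged Variant array (names invented for illustration). Note that clearing a Variant slot already releases any nested array it holds, so this is effectively what a single Erase on the top-level array does in one statement, per the Erase suggestion above:

Sub ShrinkToOne(a() As Variant)
    Dim i As Long
    For i = LBound(a) To UBound(a)
        If IsObject(a(i)) Then
            Set a(i) = Nothing   ' release an object reference
        Else
            a(i) = Empty         ' clear a value, or a nested array and its contents
        End If
    Next i
    ReDim a(1 To 1)              ' leave a single empty element behind
End Sub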











 