Very Slow reading excel data into an array (while opened in ne

The proper way, which I'm sure you know, is to declare an object of
the type you want, then destroy it at the end of the macro:

Sub MyMacro()
    Dim xl As Application      ' or Excel.Application when automating from another host
    Set xl = Application       ' grab a reference to the running instance
    ' your code here
    Set xl = Nothing           ' release the reference when you're done
End Sub
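
If you really do need a separate instance of Excel (which, judging
from your other thread, is where the memory creep comes from), the
same idea applies, except that you must also Quit the instance you
created. This is only a minimal sketch, assuming late binding and that
the extra instance should stay hidden:

Sub MyMacroNewInstance()
    Dim xlApp As Object                        ' late-bound Excel.Application
    Set xlApp = CreateObject("Excel.Application")
    xlApp.Visible = False                      ' keep the extra instance hidden
    ' your code here -- qualify every workbook/range with xlApp
    xlApp.Quit                                 ' close the instance you created
    Set xlApp = Nothing                        ' then release the reference
End Sub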

I think at least part of the problem is your goal statement. Your
actual goal appears to be to summarize data contained in several
hundred workbooks, but the goal as you've stated it is tied to one
particular path towards it. Rethinking that path might help you write
better code.

1) Is there a better way to accomplish this goal --
For example, can you use the Pivot Table's Multiple Consolidation
Ranges feature to pull the information from each workbook? (There is a
code sketch after this list.) The ranges all need the same headers; if
they don't have them now, can they be made consistent?

2) Is there anything you know about the files/folders that can be used
to optimize the code --
Are the files kept in the same folder? If not, can they be moved
into the same folder for this purpose?
Are the files static, or are they updated periodically? If so, are all
of them updated, or just some?
Do the ranges you want to copy in each file have the same name and
size? Or is each range in each worksheet a different size?

3) Why does the information need to be read into an array?
Can you simply cut and paste the information into one super-worksheet?

4) Have you stepped through the code to see where it could be
shortened or changed?

5) The code section that starts "colFileName = col0: col0 = col0 + 1
'Filename "
What are you trying to do here? Can the information be handled some
other way?

6) What is the purpose of the code section that starts "For r = 1 To
rowsMaster"? You should lessen hits on the worksheet as much as
possible by using the array technique I showed you (see the sketch
after this list). Then you can loop through the array, which is much
faster than having VBA hit the worksheet over and over -- each
worksheet read is, relatively speaking, far more costly in speed and
efficiency.
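
To illustrate point 6, here is a minimal sketch of the array
technique; the sheet name and range are just placeholders:

Sub SummarizeWithArray()
    Dim data As Variant
    Dim r As Long, c As Long

    ' one hit on the worksheet: pull the whole block into memory
    data = Worksheets("Sheet1").Range("A1").CurrentRegion.Value

    ' then loop through the in-memory array instead of the cells
    For r = 1 To UBound(data, 1)
        For c = 1 To UBound(data, 2)
            ' do something with data(r, c)
        Next c
    Next r
End Sub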

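And to illustrate point 1, a Multiple Consolidation Ranges pivot can
be built in code as well as through the wizard. A rough sketch only --
the workbook names, ranges and page-field labels below are
placeholders, and all of the ranges must share the same headers:

Sub ConsolidationPivot()
    Dim pc As PivotCache

    ' each element pairs an R1C1-style range reference with a page-field item
    Set pc = ActiveWorkbook.PivotCaches.Add( _
        SourceType:=xlConsolidation, _
        SourceData:=Array( _
            Array("[Book1.xls]Sheet1!R1C1:R50C5", "Book1"), _
            Array("[Book2.xls]Sheet1!R1C1:R50C5", "Book2")))

    pc.CreatePivotTable TableDestination:=Worksheets.Add.Range("A3"), _
        TableName:="SummaryPivot"
End Sub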

I found a routine that might help you. It loops through a folder and
lets you do something with each file it finds.

http://www.vbaexpress.com/kb/getarticle.php?kb_id=245
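
In case that link ever goes stale, the basic idea is easy to sketch;
the folder path, file pattern and processing step are placeholders:

Sub ProcessFolder()
    Dim folderPath As String
    Dim fileName As String
    Dim wb As Workbook
    Dim data As Variant

    folderPath = "C:\MyFiles\"                 ' placeholder folder
    fileName = Dir(folderPath & "*.xls")       ' first matching file

    Application.ScreenUpdating = False
    Do While fileName <> ""
        Set wb = Workbooks.Open(folderPath & fileName, ReadOnly:=True)
        data = wb.Worksheets(1).UsedRange.Value    ' one read per workbook
        wb.Close SaveChanges:=False
        ' summarize data() into your master sheet here
        fileName = Dir                         ' next file in the folder
    Loop
    Application.ScreenUpdating = True
End Sub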

HTH,
JP

On Sep 12, 10:44 pm, MikeZz wrote:
#2: I used "Dim xlApp as New...." because I was having problems with my
original code not releasing memory.
If you search for another of my threads "What is proper way to Open Close
Release xls files in VBA?", you will see my original code. The problem I was
originally having by not creating a new app instance was that the Task
Manager kept showing Excel memory going up. At around my 150th file, it all
started going very slow. The routine took about 45 minutes to complete with
TM showing over 100MB of Excel memory but it completed without memory crashes.

If you know of a better way to open and close several hundred excel files....
reading the data from sheet1 each time into an array, I'm all ears! BTW, the
data array gets recycled each time it opens a new file so I don't have to
maintain all the data in memory... I process the file's data then can release
the original data.

#3: Thanks for showing how to read a range into an array.... I've been
doing it the hard way for tooo long.

My Goal:
Read an excel file (with only one sheet) into an array.
Close the excel file.
Use my existing routines to extract pieces of data I need and put into a
"Summary" workbook.
Repeat this process up to several hundred times.
Not watch my hair turn grey while waiting for it to complete the job.