I have written
VB code to run Excel 2002 in batch mode so that it performs the
following work, say, 100 times. Using a given data file--call it data_i--I
create a worksheet with hundreds of charts on it. I save the workbook, then
copy it to a new file called, say, data_i.xls. Then I clear all the charts
on the worksheet and process data file data_(i+1).
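For reference, the loop has roughly this shape (a simplified sketch, not my exact code; the sheet name and the BuildCharts helper are placeholders for my own routines):

```vba
Sub BatchProcess()
    Dim i As Long
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets("Charts")
    For i = 1 To 100
        BuildCharts ws, "data_" & i             ' create hundreds of charts from data_i
        ThisWorkbook.Save
        ThisWorkbook.SaveCopyAs "data_" & i & ".xls"
        ws.ChartObjects.Delete                  ' clear all charts before the next file
    Next i
End Sub
```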
I would expect Excel's memory usage to oscillate up and down,
but there is a definite, significant upward trend superimposed on the
oscillation. Evidently Excel is caching something between iterations that I
don't need. As an example, I start with an Excel file of
1 MB in size, and each iteration creates a 45 MB .xls file, but after 10-15
iterations the Excel process I am running is using 500 MB of memory!
Can anyone suggest a way to make Excel flush whatever it is keeping
around between iterations?