ExcelBanter (https://www.excelbanter.com/)
-   Excel Programming (https://www.excelbanter.com/excel-programming/)
-   -   Macros run slow after first run (https://www.excelbanter.com/excel-programming/314099-macros-run-slow-after-first-run.html)

Ken Loomis

Macros run slow after first run
 
The macros I have written open four files from a database and mostly just
combine and reformat them, with some deletions based on simple criteria,
and then remove duplicates. They also strip out some columns from a couple
of the files.

Then the report is printed, all the data is deleted, and the process is run
again on another set of four data files. The user may need to run this
operation and print reports for as many as 4 or 5 sets of data files. Each
combined report contains anywhere from 800 to 5000 rows of customer data.

After the first set of data is processed, the subsequent runs seem to be
much slower, even if the second or third data set is smaller than the first
one.

Is there something I need to do before processing these subsequent reports
that may make things run faster?

Thanks for any insights anyone can offer on this.

Ken Loomis



Leo Merikallio

Macros run slow after first run
 
Macros don't get slower just from being used, that's for sure. :)

Take note of what has changed between the 1st and the 2nd run. For
example, if there's an empty sheet before the 1st run and a full sheet
after it, then it takes more time for the macro to clear the sheet, if
it has functionality like that.

Yes, that's a simplistic example, but you catch the drift.

For a huge speed increase, add this as the first line of the macro:
Application.ScreenUpdating = False
and on the last line set the value back to True
(if your macro does not ask for user input in the middle).
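
A minimal sketch of how that pattern might look in practice (the macro name
CombineReports is just a placeholder, not from the original post):

Sub CombineReports()
    ' Stop Excel repainting the screen while the macro works
    Application.ScreenUpdating = False

    ' ... open the four data files, combine, reformat, remove duplicates ...

    ' Restore normal behaviour before handing control back to the user
    Application.ScreenUpdating = True
End Sub

Many people also switch calculation to manual for the duration
(Application.Calculation = xlCalculationManual at the start,
xlCalculationAutomatic at the end), which can help further if the sheets
contain formulas.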


"Ken Loomis" wrote in message ...


Tom Ogilvy

Macros run slow after first run
 
At the top of the macro put something like

Worksheets("Report").DisplayPageBreaks = False

Do this for any sheet where page breaks are being displayed.

Since you print the report, I assume page breaks are being displayed; these
can cause your macro to slow down, because Excel recalculates the page
layout/page breaks on every change.
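
A minimal sketch of applying that to every sheet in the workbook (the loop
and the procedure name are illustrative, not from Tom's post):

Sub DisablePageBreakDisplay()
    Dim ws As Worksheet
    ' Turn off the page-break display on each sheet so Excel does not
    ' repaginate after every change the macro makes
    For Each ws In ActiveWorkbook.Worksheets
        ws.DisplayPageBreaks = False
    Next ws
End Sub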

--
Regards,
Tom Ogilvy


"Ken Loomis" wrote in message
...





Ken Loomis

Macros run slow after first run
 
Thanks, Tom.

That sure made a difference.

I just ran two reports, the first with 138 pages and the second with only 28
pages. Until now, that second report took much longer to run than the first
whenever I ran those data sets in that order.

After adding that line at the beginning of my code, the second data set runs
in less time than the first, just as it does when I run the smaller report
first.
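
Putting the two suggestions together, a rough sketch of how the start and end
of such a macro could look (the sheet name "Report" comes from Tom's example;
everything else here is illustrative):

Sub RunCombinedReport()
    Application.ScreenUpdating = False
    Worksheets("Report").DisplayPageBreaks = False

    ' ... combine the four data files, remove duplicates, print the report ...

    Application.ScreenUpdating = True
End Sub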

Ken Loomis

"Tom Ogilvy" wrote in message
...







