#1
Posted to microsoft.public.excel.programming
The macros I have written open four files from a database and mostly just combine and reformat them, with some deletions based on simple criteria, and remove duplicates. They also strip some columns out of a couple of the files. Then the report gets printed, all the data is deleted, and the whole thing is run again on another set of four data files. The user may need to run this operation and print reports for as many as 4 or 5 sets of data files. Each combined report contains anywhere from 800 to 5000 rows of customer data.

After the first set of data is processed, the subsequent runs seem to be much slower, even if the second or third data set is smaller than the first one. Is there something I need to do before processing these subsequent reports that might make things run faster?

Thanks for any insights anyone can offer on this.

Ken Loomis
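The loop described above (open four files, combine them, strip columns, remove duplicates, print, clear, repeat) might be sketched roughly like this. This is an editor's illustration only: the file list, sheet layout, deleted columns, and duplicate-key column are all assumptions, not details from the post, and `RemoveDuplicates` requires Excel 2007 or later.

```vba
' Rough sketch of the workflow described above (illustrative; names assumed).
Sub BuildReport(fileNames As Variant)
    Dim wbReport As Workbook, wbSrc As Workbook
    Dim i As Long

    Set wbReport = Workbooks.Add

    For i = LBound(fileNames) To UBound(fileNames)
        Set wbSrc = Workbooks.Open(fileNames(i))
        ' Append each source file's data below any rows already combined
        wbSrc.Worksheets(1).UsedRange.Copy _
            wbReport.Worksheets(1).Cells(Rows.Count, 1).End(xlUp).Offset(1)
        wbSrc.Close SaveChanges:=False
    Next i

    With wbReport.Worksheets(1)
        .Columns("D:E").Delete                  ' strip unneeded columns (assumed)
        .UsedRange.RemoveDuplicates Columns:=1  ' dedupe on column A (Excel 2007+)
        .PrintOut
        .UsedRange.Clear                        ' clear out for the next data set
    End With
End Sub
```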
#2
Macros don't get slower just from being used, that's for sure. :)

Take note of what has changed between the first and the second run. For example, if the sheet is empty before the first run and full after it, then the macro takes extra time to clear the sheet, assuming it has functionality like that. A trivial example, but you catch the drift.

For a big speed increase, add this as the first line of the macro:

    Application.ScreenUpdating = False

and on the last line set the value back to True (provided your macro does not ask for user input in the middle).

"Ken Loomis" wrote in message ...
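A minimal pattern for the `ScreenUpdating` tip above, with an error handler so the setting is restored even if the macro fails partway through (the sub name and body are placeholders):

```vba
' Turn off screen redrawing for the duration of the macro,
' restoring it on exit even when an error occurs.
Sub FastReport()
    On Error GoTo CleanUp
    Application.ScreenUpdating = False

    ' ... the report-building work goes here ...

CleanUp:
    Application.ScreenUpdating = True
End Sub
```

Restoring the flag in a cleanup label matters: if the macro errors out while `ScreenUpdating` is still False, Excel can appear frozen to the user.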
#3
At the top of the macro put something like:

    Worksheets("Report").DisplayPageBreaks = False

Do this for any sheet where page breaks are being displayed. Since you printed, I assume page breaks are being displayed, and these can cause your macro to slow down, since Excel recalculates the page layout/page breaks on every change.

--
Regards,
Tom Ogilvy

"Ken Loomis" wrote in message ...
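The page-break tip above is often combined with the other usual speed settings, all restored at the end. A sketch, assuming the report sheet is named "Report" as in Tom's example:

```vba
' Disable page-break display, screen redrawing, and automatic
' recalculation while the report is built; restore everything after.
Sub SpeedUp()
    Dim prevCalc As XlCalculation
    prevCalc = Application.Calculation
    On Error GoTo CleanUp

    Worksheets("Report").DisplayPageBreaks = False  ' stop page-layout recalcs
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual

    ' ... process and print the report here ...

CleanUp:
    Application.Calculation = prevCalc
    Application.ScreenUpdating = True
End Sub
```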
#4
Thanks, Tom.

That sure made a difference. I just ran two reports: the first with 138 pages and the second with only 28. Until now, that second report took much longer to run than the first whenever I ran those data sets in that order. After adding that line at the beginning of my code, the second data set runs in less time than the first, just as it does when I run the smaller report first.

Ken Loomis

"Tom Ogilvy" wrote in message ...