Posted to microsoft.public.excel.programming
I found a serious problem with the VBA in a workbook that generates reports.
There is a "Master" workbook, that the users open and then they import several text files to create the report. They then print that report and save it. I never thought to remove all the code from workbook before they saved it. Now I find that if they open one of those reports, all the information could be wiped out. I have corrected the problem for any new report they run, but the problem is still stored away in all the old reports. I figured out that if I open one of those old reports and simple remove all the code, then save the file, everything is fine. So what I have done is build a new report generator, that will build a list of all the potentially problem files the first time the report generator is run. I use FileSearch to look for "*.xls" files that contain the specific sub name. All these files names (with paths) are added to a hidden worksheet. Then every time the report builder is opened, I plan to correct several of these problematic files, until they have all been fixed. I could do this all at once, but the problem is how long it ties up the user's machine. Just searching the My Documents folder for problematic files takes over an hour. And, fixing all of the files at once will take even longer. I tried just searching for the "*.xls" files and then going back to search those for the specific sub name but just finding the "*.xls" files takes even longer. I can not count on the report file names being of a specific format. each user uses a different naming convention. Can someone suggest anything faster than this code: With Application.FileSearch .NewSearch .LookIn = "C:\My Documents" .SearchSubFolders = True .TextOrProperty = "BuildStreetsReports" .MatchTextExactly = False .filename = "*.xls" .Execute TIA, Ken |