#1
Posted to microsoft.public.excel.programming
I have a macro which calls the Sub below 17,000+ times to get data from an
equal number of small files. The macro always starts out running very fast,
but quickly slows to a snail's pace. The only way I have found to speed it up
again is to stop the macro, save the file and close Excel, then restart Excel,
reload the worksheet and have it pick up where it left off. Clearly some
resource is being used up and not released between calls to the Sub. Perhaps I
end up with a myriad of open files? Any help in resolving this issue would be
much appreciated. Thanks.

Here is the Sub:

Sub GetParcelInfoFromFile(Parcel As String)
    Dim FileName As String
    Dim qt As QueryTable

    On Error GoTo NoQT
    'Try to speed this up by deleting queries
    For Each qt In Worksheets("Temp").QueryTables
        qt.Delete
    Next qt
NoQT:
    On Error GoTo 0

    'Build name of file containing data
    FileName = "F:\IOW Harvest Data\" & Left(Parcel, 1) & "\" & Parcel & ".txt"
    If Len(Dir(FileName)) = 0 Then      'If file does not exist
        Range("Temp!A1") = "No File"
        Exit Sub
    End If

    Worksheets("Temp").Select           'Get data into this sheet
    With Worksheets("Temp").QueryTables.Add(Connection:="TEXT;" & FileName, _
            Destination:=Range("A1"))
        .Name = Parcel
        .FieldNames = True
        .RowNumbers = False
        .FillAdjacentFormulas = False
        .PreserveFormatting = True
        .RefreshOnFileOpen = False
        .RefreshStyle = xlInsertDeleteCells
        .SavePassword = False
        .SaveData = True
        .AdjustColumnWidth = True
        .RefreshPeriod = 0
        .TextFilePromptOnRefresh = False
        .TextFilePlatform = xlWindows
        .TextFileStartRow = 1
        .TextFileParseType = xlDelimited
        .TextFileTextQualifier = xlTextQualifierDoubleQuote
        .TextFileConsecutiveDelimiter = False
        .TextFileTabDelimiter = True
        .TextFileSemicolonDelimiter = False
        .TextFileCommaDelimiter = False
        .TextFileSpaceDelimiter = False
        .TextFileColumnDataTypes = Array(1, 1, 1, 1, 1, 1, 1, 1, 1)
        .Refresh BackgroundQuery:=False
    End With

    Worksheets(Orig).Select     'Orig is presumably a module-level variable holding the original sheet name
End Sub
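A quick way to confirm that something really is accumulating between calls is
to log the object counts every few hundred files and watch them in the
Immediate window. A minimal diagnostic sketch (the 500-call interval and the
"Temp" sheet name simply mirror the Sub above; they are illustrative, not part
of the original code):

Sub LogResourceCounts(CallNumber As Long)
    'Hypothetical diagnostic: print how many query tables and defined names
    'exist every 500 calls; a steadily climbing count points at the leak.
    If CallNumber Mod 500 = 0 Then
        Debug.Print "Call " & CallNumber & _
                    "  QueryTables: " & Worksheets("Temp").QueryTables.Count & _
                    "  Names: " & ThisWorkbook.Names.Count
    End If
End Sub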
#2
Posted to microsoft.public.excel.programming
Have you tried just opening the file, getting the data, storing it, and
closing the file? I'm not sure what you gain by creating QueryTables.

Also, have you checked the Names collection to see if you are creating a
defined name for each QueryTable?

Is calculation set to manual?

--
Regards,
Tom Ogilvy
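For the "just open the file and grab the data" idea, a plain text read avoids
creating a QueryTable (and therefore a defined name) for every file. A rough
sketch of that approach, assuming the files are tab-delimited as the
QueryTable settings suggest and the data should land starting at Temp!A1;
error handling is left out:

Sub GetParcelInfoDirect(Parcel As String)
    Dim FileName As String, LineText As String
    Dim Fields As Variant
    Dim f As Integer, r As Long, c As Long

    'Same file-naming scheme as the original Sub
    FileName = "F:\IOW Harvest Data\" & Left(Parcel, 1) & "\" & Parcel & ".txt"
    If Len(Dir(FileName)) = 0 Then          'If file does not exist
        Worksheets("Temp").Range("A1").Value = "No File"
        Exit Sub
    End If

    f = FreeFile
    Open FileName For Input As #f
    r = 1
    Do While Not EOF(f)
        Line Input #f, LineText
        Fields = Split(LineText, vbTab)     'tab-delimited, per the QueryTable settings
        For c = 0 To UBound(Fields)
            Worksheets("Temp").Cells(r, c + 1).Value = Fields(c)
        Next c
        r = r + 1
    Loop
    Close #f
End Sub

The file is opened and closed inside the Sub, so nothing is left behind
between the 17,000+ calls.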
#3
Posted to microsoft.public.excel.programming
Thanks for your reply. It was a big help, but there is still a problem:

On Thu, 13 Oct 2005 20:51:10 -0400, "Tom Ogilvy" wrote:

> Have you tried just opening the file, getting the data, storing it, and
> closing the file?

No, the files are .txt files that I wanted to load into a worksheet in the
same workbook, but I could try just opening them, grabbing the data I need
and closing them.

> I'm not sure what you gain by creating QueryTables.

Did I answer this above? Perhaps there is something I don't understand.

> Also, have you checked the Names collection to see if you are creating a
> defined name for each QueryTable?

I did check, and that was a significant part of the problem. Thanks.
Execution now slows down much less than it did, but it still slows down.

> Is calculation set to manual?

There are no formulas in any cell of the workbook. Nevertheless, I did try
setting calculation to manual and the slowdown was unchanged.

Thanks again!
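If some of those names still slip through, sweeping the Names collection after
deleting the query tables keeps it from growing. A small sketch of that
cleanup; it assumes the Temp sheet is used only for these imports, and the
procedure name is just illustrative:

Sub DeleteTempQueryNames()
    'QueryTable.Delete can leave its defined name behind, so remove any
    'workbook-level name that still refers to a range on the Temp sheet.
    'Iterating backwards avoids skipping entries as names are deleted.
    Dim i As Long
    Dim rng As Range

    For i = ThisWorkbook.Names.Count To 1 Step -1
        Set rng = Nothing
        On Error Resume Next                'RefersToRange fails for non-range names
        Set rng = ThisWorkbook.Names(i).RefersToRange
        On Error GoTo 0
        If Not rng Is Nothing Then
            If rng.Worksheet.Name = "Temp" Then ThisWorkbook.Names(i).Delete
        End If
    Next i
End Sub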