Posted to microsoft.public.excel.programming
RB Smissaert

Fastest reading of large text files

Maybe this function can help you. It reads the text file into a 1-D array.
You can specify a last row higher than the actual number of lines in the
file; the On Error Resume Next will take care of that. Once you have your
data in the array you can do whatever you want with it, and it will be
much faster.

'Reads a text file line by line into the passed 1-D array.
'The array must be dimensioned by the caller before the call;
'if UBRow runs past the end of the file (or the array), the
'On Error Resume Next simply skips the failing reads.
Function OpenTextFileToArray(ByVal txtFile As String, _
                             ByRef arr As Variant, _
                             ByVal LBRow As Long, _
                             ByVal UBRow As Long) As Variant

    Dim hFile As Long
    Dim r As Long

    hFile = FreeFile

    Open txtFile For Input As #hFile

    On Error Resume Next

    For r = LBRow To UBRow
        Input #hFile, arr(r)
    Next r

    Close #hFile

    OpenTextFileToArray = arr

End Function
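
For example, a calling routine could look something like this (the file
path, array bounds and target sheet are just placeholders, so substitute
your own):

Sub TestOpenTextFile()

    'Upper bound deliberately set higher than the expected file length
    Dim arr(1 To 60000) As Variant

    'Hypothetical path - use one of your own text files here
    OpenTextFileToArray "C:\Data\numbers.txt", arr, 1, 60000

    'Write the whole array to a column in one assignment;
    'Transpose turns the 1-D array into a column of values
    Sheets(1).Range("A1:A60000").Value = _
        Application.Transpose(arr)

End Sub

Writing the array to the sheet in one Range assignment like this is much
faster than putting the values in cell by cell.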


RBS



"hmm" wrote in message
...
I have developed a macro that opens large text files (each has a list of
about 50,000 lines, almost all single numbers), reads them one at a time,
and places each one in a successive column.

The method I used is to open a text file, copy column A, paste it into the
next available column in the main sheet, close the text file, and repeat
for all text files in a folder (there could be up to 100 of them).

I am finding it could take about a minute to run this macro (for 35 files).

My question: will I gain speed by using the command "Open FileName For
Input As #FileNum"? (I would try it myself; since there's a learning curve
for me, I'm hoping to get somebody's input first.)

Thanks.