#1
Posted to microsoft.public.excel.programming
hmm
Fastest reading of large text files

I have developed a macro that opens large text files (each has a list of about 50,000 lines, almost all single numbers), reads them one at a time, and places each file's contents in a successive column.

The method I used is to open a text file, copy column A, paste it into the next available column in the main sheet, close the text file, and repeat for all the text files in a folder (there could be up to 100 of them).

I am finding it takes about a minute to run this macro (for 35 files).

My question: will I gain speed by using the command "Open FileName For Input As #FileNum"? (I would try it myself, but since there's a learning curve for me, I'm hoping to get somebody's input first.)

Thanks.
#2
Posted to microsoft.public.excel.programming
RB Smissaert
Fastest reading of large text files

Maybe this function can help you. It will put the text in a 1-D array. You can specify the last row higher than it really is; the On Error Resume Next will take care of that. Once you have your data in the array you can do whatever you want with it, and it will be much faster.

Function OpenTextFileToArray(ByVal txtFile As String, _
                             ByRef arr As Variant, _
                             ByVal LBRow As Long, _
                             ByVal UBRow As Long) As Variant

    Dim hFile As Long
    Dim r As Long

    ' Get a free file handle and open the text file for sequential input.
    hFile = FreeFile
    Open txtFile For Input As #hFile

    ' If UBRow runs past the end of the file, Input # raises an error;
    ' On Error Resume Next simply leaves the remaining elements empty.
    On Error Resume Next

    For r = LBRow To UBRow
        Input #hFile, arr(r)
    Next r

    Close #hFile

    OpenTextFileToArray = arr

End Function


RBS
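
A closely related sketch, not from the post above, uses Line Input # instead of Input #. Input # treats commas as well as line breaks as delimiters, so if a line could ever contain a comma, Line Input # (which reads each whole line as one string) may be safer. The function name here is purely illustrative:

' Illustrative variant (not from the thread): same idea using Line Input #.
Function OpenTextFileToArrayLines(ByVal txtFile As String, _
                                  ByRef arr As Variant, _
                                  ByVal LBRow As Long, _
                                  ByVal UBRow As Long) As Variant

    Dim hFile As Long
    Dim r As Long
    Dim sLine As String

    hFile = FreeFile
    Open txtFile For Input As #hFile

    ' Stop at the requested last row or at the end of the file,
    ' whichever comes first, so no error handling is needed.
    For r = LBRow To UBRow
        If EOF(hFile) Then Exit For
        Line Input #hFile, sLine
        arr(r) = sLine
    Next r

    Close #hFile

    OpenTextFileToArrayLines = arr

End Function

Note that Line Input returns text, so numbers read this way may need converting (for example with Val) before they behave as numbers on the sheet.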



"hmm" wrote in message
...
I have developed a macro that opens large text files (each has a list of
about 50,000 lines, almost all single numbers), reads them one at a time,
and
places each one in a successive column.

The method I used is to open a text file, copy column A, paste it into the
next available column in the main sheet, close the text file, and repeat
for
all text files in a folder (there could be up to 100 of them).

I am finding it could take about a minute to run this macro (for 35
files).

My question: will I gain speed by using the command "Open FileName For
Input
As #FileNum"? (I would try it myself; since there's a learning curve for
me,
I'm hoping to get somebody's input first.)

Thanks.


#3
Posted to microsoft.public.excel.programming
hmm
Fastest reading of large text files

Sounds interesting. What is the "arr" that is passed to the function?

"RB Smissaert" wrote:

Maybe this function can help you. It will put the text in a 1-D array.
You can specify the last row higher than the reality, the On Error Resume
Next
will take care of that. Once you have your data in the array you can do
whatever you
want with it and it will be much faster.

Function OpenTextFileToArray(ByVal txtFile As String, _
ByRef arr As Variant, _
ByVal LBRow As Long, _
ByVal UBRow As Long) As Variant

Dim hFile As Long
Dim r As Long

hFile = FreeFile

Open txtFile For Input As #hFile

On Error Resume Next

For r = LBRow To UBRow
Input #hFile, arr(r)
Next r

Close #hFile

OpenTextFileToArray = arr

End Function


RBS



"hmm" wrote in message
...
I have developed a macro that opens large text files (each has a list of
about 50,000 lines, almost all single numbers), reads them one at a time,
and
places each one in a successive column.

The method I used is to open a text file, copy column A, paste it into the
next available column in the main sheet, close the text file, and repeat
for
all text files in a folder (there could be up to 100 of them).

I am finding it could take about a minute to run this macro (for 35
files).

My question: will I gain speed by using the command "Open FileName For
Input
As #FileNum"? (I would try it myself; since there's a learning curve for
me,
I'm hoping to get somebody's input first.)

Thanks.



#4
Posted to microsoft.public.excel.programming
RB Smissaert
Fastest reading of large text files

Look at this example:

Sub Test()

    Dim i As Long
    Dim arr(1 To 10)    ' Variant array to receive the file contents

    ' Read the first 10 lines of the file into arr.
    OpenTextFileToArray "C:\TestFile2.txt", arr, 1, 10

    For i = 1 To 10
        MsgBox arr(i)
    Next

End Sub


Just try it out on a file. You may have to adjust it according to your target file.


RBS


"hmm" wrote in message
...
Sounds interesting. What is "arr" passed to the function?

"RB Smissaert" wrote:

Maybe this function can help you. It will put the text in a 1-D array.
You can specify the last row higher than the reality, the On Error Resume
Next
will take care of that. Once you have your data in the array you can do
whatever you
want with it and it will be much faster.

Function OpenTextFileToArray(ByVal txtFile As String, _
ByRef arr As Variant, _
ByVal LBRow As Long, _
ByVal UBRow As Long) As Variant

Dim hFile As Long
Dim r As Long

hFile = FreeFile

Open txtFile For Input As #hFile

On Error Resume Next

For r = LBRow To UBRow
Input #hFile, arr(r)
Next r

Close #hFile

OpenTextFileToArray = arr

End Function


RBS



"hmm" wrote in message
...
I have developed a macro that opens large text files (each has a list of
about 50,000 lines, almost all single numbers), reads them one at a
time,
and
places each one in a successive column.

The method I used is to open a text file, copy column A, paste it into
the
next available column in the main sheet, close the text file, and
repeat
for
all text files in a folder (there could be up to 100 of them).

I am finding it could take about a minute to run this macro (for 35
files).

My question: will I gain speed by using the command "Open FileName For
Input
As #FileNum"? (I would try it myself; since there's a learning curve
for
me,
I'm hoping to get somebody's input first.)

Thanks.




#5
Posted to microsoft.public.excel.programming
hmm
Fastest reading of large text files

Thanks RB.

As I understand it, arr is an array of type Variant, which the function OpenTextFileToArray populates with the contents of the named text file. But isn't that array only in the PROGRAM? If so, how do I transfer this array to a COLUMN in the WORKSHEET? That is my goal: to transfer a series of text files into successive columns of a worksheet. I'm sorry if I did not say so clearly.

"RB Smissaert" wrote:

Look at this example:

Sub Test()

Dim i As Long
Dim arr(1 To 10)

OpenTextFileToArray "C:\TestFile2.txt", arr, 1, 10

For i = 1 To 10
MsgBox arr(i)
Next

End Sub


Just try it out on file. You may have to adjust it according to your target
file.


RBS


"hmm" wrote in message
...
Sounds interesting. What is "arr" passed to the function?

"RB Smissaert" wrote:

Maybe this function can help you. It will put the text in a 1-D array.
You can specify the last row higher than the reality, the On Error Resume
Next
will take care of that. Once you have your data in the array you can do
whatever you
want with it and it will be much faster.

Function OpenTextFileToArray(ByVal txtFile As String, _
ByRef arr As Variant, _
ByVal LBRow As Long, _
ByVal UBRow As Long) As Variant

Dim hFile As Long
Dim r As Long

hFile = FreeFile

Open txtFile For Input As #hFile

On Error Resume Next

For r = LBRow To UBRow
Input #hFile, arr(r)
Next r

Close #hFile

OpenTextFileToArray = arr

End Function


RBS



"hmm" wrote in message
...
I have developed a macro that opens large text files (each has a list of
about 50,000 lines, almost all single numbers), reads them one at a
time,
and
places each one in a successive column.

The method I used is to open a text file, copy column A, paste it into
the
next available column in the main sheet, close the text file, and
repeat
for
all text files in a folder (there could be up to 100 of them).

I am finding it could take about a minute to run this macro (for 35
files).

My question: will I gain speed by using the command "Open FileName For
Input
As #FileNum"? (I would try it myself; since there's a learning curve
for
me,
I'm hoping to get somebody's input first.)

Thanks.






#6
Posted to microsoft.public.excel.programming
RB Smissaert
Fastest reading of large text files

OK, then you will have to alter it slightly. Give the function a 2-D array and populate that. So you will get something like this (aircode):

In the test sub:

    Dim arr(1 To 10, 1 To 1)

    OpenTextFileToArray "C:\TestFile2.txt", arr, 1, 10

    ' Writing the 2-D array to an equally sized range fills the column in one go.
    Range(Cells(1, 1), Cells(10, 1)).Value = arr

In the function:

    For r = LBRow To UBRow
        Input #hFile, arr(r, 1)
    Next r


RBS
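
Pulling the pieces of this thread together, a rough, untested sketch of the whole job might look like the code below: the 2-D version of the function, plus a loop that reads every .txt file in a folder into successive columns of the active sheet. The folder path "C:\Data\" and the 50,000-row limit are only assumptions; adjust them to the real files.

' Rough sketch combining the thread's pieces; folder path and row limit are assumptions.
Function TextFileToColumnArray(ByVal txtFile As String, _
                               ByVal nRows As Long) As Variant

    Dim arr() As Variant
    Dim hFile As Long
    Dim r As Long

    ' One column, nRows deep, so the result can be dropped straight
    ' onto a worksheet column.
    ReDim arr(1 To nRows, 1 To 1)

    hFile = FreeFile
    Open txtFile For Input As #hFile

    On Error Resume Next   ' ignore "input past end of file" if nRows is too big
    For r = 1 To nRows
        Input #hFile, arr(r, 1)
    Next r
    On Error GoTo 0

    Close #hFile

    TextFileToColumnArray = arr

End Function

Sub ImportFolderToColumns()

    Const FOLDER As String = "C:\Data\"   ' assumed location of the text files
    Const MAXROWS As Long = 50000         ' assumed upper bound on lines per file

    Dim fName As String
    Dim col As Long
    Dim arr As Variant

    Application.ScreenUpdating = False

    col = 1
    fName = Dir(FOLDER & "*.txt")

    Do While fName <> ""
        arr = TextFileToColumnArray(FOLDER & fName, MAXROWS)
        ' One assignment writes the whole column at once,
        ' which is where most of the speed gain comes from.
        Cells(1, col).Resize(MAXROWS, 1).Value = arr
        col = col + 1
        fName = Dir   ' next matching file in the folder
    Loop

    Application.ScreenUpdating = True

End Sub

Reading each file into an array and writing the array in a single assignment avoids the repeated open, copy, paste and close cycle, which is what makes the original macro slow.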


"hmm" wrote in message
...
Thanks RB.

As I understand it, arr is an array of type Variant, which the function
OpenTextFileToArray is populating with the contents of the named text
file.
But is it not an array in the PROGRAM? If so, how do I transfer this
array
to a COLUMN in the WORKSHEET? That is my goal, to transfer a series of
text
files into successive rows of a worksheet. I'm sorry if I did not say so
clearly.

"RB Smissaert" wrote:

Look at this example:

Sub Test()

Dim i As Long
Dim arr(1 To 10)

OpenTextFileToArray "C:\TestFile2.txt", arr, 1, 10

For i = 1 To 10
MsgBox arr(i)
Next

End Sub


Just try it out on file. You may have to adjust it according to your
target
file.


RBS


"hmm" wrote in message
...
Sounds interesting. What is "arr" passed to the function?

"RB Smissaert" wrote:

Maybe this function can help you. It will put the text in a 1-D array.
You can specify the last row higher than the reality, the On Error
Resume
Next
will take care of that. Once you have your data in the array you can
do
whatever you
want with it and it will be much faster.

Function OpenTextFileToArray(ByVal txtFile As String, _
ByRef arr As Variant, _
ByVal LBRow As Long, _
ByVal UBRow As Long) As Variant

Dim hFile As Long
Dim r As Long

hFile = FreeFile

Open txtFile For Input As #hFile

On Error Resume Next

For r = LBRow To UBRow
Input #hFile, arr(r)
Next r

Close #hFile

OpenTextFileToArray = arr

End Function


RBS



"hmm" wrote in message
...
I have developed a macro that opens large text files (each has a list
of
about 50,000 lines, almost all single numbers), reads them one at a
time,
and
places each one in a successive column.

The method I used is to open a text file, copy column A, paste it
into
the
next available column in the main sheet, close the text file, and
repeat
for
all text files in a folder (there could be up to 100 of them).

I am finding it could take about a minute to run this macro (for 35
files).

My question: will I gain speed by using the command "Open FileName
For
Input
As #FileNum"? (I would try it myself; since there's a learning
curve
for
me,
I'm hoping to get somebody's input first.)

Thanks.






