Ideas for a quicker way to populate adjacent cells with array elements
First, Excel may not be the best application for this amount of data.
I notice that Excel starts to slow down considerably when I have lots of rows of
data (even with no formulas).
I don't use Access (and maybe your customers don't either), but have you thought
of using a different application?
Bing wrote:
Hi Dave,
Thanks for responding.
The reason I didn't use Workbooks.Open or Workbooks.OpenText comes down to a
couple of things:
1) Some files run to well over 65K lines. Since Excel has a limit of 65,536 rows
per worksheet, I need to import the file across multiple worksheets.
2) Yes, I wanted more control. Although I haven't actually tested which method is
faster (i.e. using Workbooks.Open first and then combining the columns later,
versus using Line Input, combining the columns first, then pasting to the
worksheet later), my gut feeling is that the former is quicker at loading lines
into the workbook, but the latter is quicker at combining the columns. You may
well be right that as the number of lines increases it becomes quicker to use
Workbooks.Open first and combine later. But because of 1) I can't see an
alternative to the approach I outlined earlier (reading the file in manually,
line by line, so that I can add a new worksheet in the event the line count
passes 64K); that worksheet rollover is sketched below.
This application is being developed for customers, so I'd like to minimize the
steps they have to perform; i.e. I don't want them to have to split the large
file into individual 64K-line subfiles.
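A minimal sketch of that rollover idea, assuming the Excel 97-2003 limit of
65,536 rows per sheet; the Sub name ImportAcrossSheets and the single-column
write are illustrative placeholders, not code from the original post:

Sub ImportAcrossSheets(ByVal fileName As String)
    Dim inputFile As Integer
    Dim inputLine As String
    Dim ws As Worksheet
    Dim rowNum As Long
    Const rowLimit As Long = 65536   ' Excel 97-2003 worksheet row limit

    inputFile = FreeFile
    Open fileName For Input As #inputFile

    Set ws = Worksheets.Add
    rowNum = 1

    Do Until EOF(inputFile)
        Line Input #inputFile, inputLine
        If rowNum > rowLimit Then
            ' Start a fresh sheet and carry on where we left off.
            Set ws = Worksheets.Add(After:=ws)
            rowNum = 1
        End If
        ws.Cells(rowNum, 1).Value = inputLine   ' real code would Split and massage here
        rowNum = rowNum + 1
    Loop

    Close #inputFile
End Sub

Adding each new sheet with After:=ws keeps the pieces in file order.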
"Dave Peterson" wrote:
Is there a reason you couldn't just use:
workbooks.open filename:="C:\myfolder\inputname.csv"
If you want more control of individual field formats, rename your .csv file to
.txt and then record a macro when you do File|Open.
I would think that opening multiple files this way (and combining later) would
be quicker (noticeably so as the number of lines gets larger).
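For what it's worth, a sketch of roughly what such a recorded macro tends to
look like; the path and the per-column FieldInfo choices are placeholders you
would take from your own recording:

Sub OpenOneTextFile()
    ' Recorded-style open of a renamed .csv, with explicit per-column formats.
    Workbooks.OpenText Filename:="C:\myfolder\inputname.txt", _
        DataType:=xlDelimited, Comma:=True, _
        FieldInfo:=Array(Array(1, xlTextFormat), Array(2, xlGeneralFormat))
End Sub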
Another option would be to use an old DOS command to combine your .csv files
into a single file, then import that once.
If the files are in the same folder and all files named *.csv should be
combined:
Shell to DOS
traverse to that folder
copy *.csv all.txt
Then exit the command prompt.
Back to Excel and import All.Txt.
===
You could even mechanize this in your code via the shell command (and maybe an
application.wait to make sure the files are combined before continuing).
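Something along these lines, perhaps; the folder, the fixed five-second wait,
and the Sub name are guesses for illustration rather than anything from the
thread:

Sub CombineAndImport()
    ' Let DOS concatenate the .csv files, pause briefly, then open the result.
    Shell "cmd /c copy C:\myfolder\*.csv C:\myfolder\all.txt", vbHide
    Application.Wait Now + TimeValue("00:00:05")
    Workbooks.Open Filename:="C:\myfolder\all.txt"
End Sub

A fixed wait is crude; polling for the output file with Dir would be more robust.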
Bing wrote:
Hi,
I've written a macro that reads in, line by line, multiple CSV files of varying
lengths.
Can anyone suggest a way to speed up this code's execution time?
For example, I thought of reading in multiple lines in bulk to reduce I/O, but
VBA doesn't seem to have a bulk read mode.
Basically, I do the following:
Dim columnArray As Variant
Dim newColumnArray As Variant
.
.
Do Until EOF(inputFile)
    Line Input #inputFile, inputLine
    columnArray = Split(inputLine, ",")
    ' ...merge/massage/apply functions to certain columns in columnArray
    ' and copy the results into newColumnArray...
    For i = 0 To UBound(newColumnArray)
        aRange.Offset(0, i).Value = newColumnArray(i)
    Next i
    Set aRange = aRange.Offset(1, 0)
Loop
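On the bulk-read question: one hedged possibility is to pull the whole file into
a string with a single Input$ call and Split it on line breaks, then write each
row to the sheet with one Range assignment instead of one cell per column. The
Sub name BufferedImport and the vbCrLf line ending are assumptions; adjust them
to your files:

Sub BufferedImport(ByVal fileName As String, ByVal aRange As Range)
    Dim inputFile As Integer
    Dim allText As String
    Dim allLines As Variant
    Dim newColumnArray As Variant
    Dim i As Long

    inputFile = FreeFile
    Open fileName For Input As #inputFile
    allText = Input$(LOF(inputFile), inputFile)   ' one bulk read of the whole file
    Close #inputFile

    allLines = Split(allText, vbCrLf)             ' assumes CRLF line endings
    For i = LBound(allLines) To UBound(allLines)
        If Len(allLines(i)) > 0 Then
            newColumnArray = Split(allLines(i), ",")
            ' ...merge/massage columns here as in the code above...
            ' One assignment fills the whole row of adjacent cells.
            aRange.Resize(1, UBound(newColumnArray) + 1).Value = newColumnArray
            Set aRange = aRange.Offset(1, 0)
        End If
    Next i
End Sub

The single Resize(...).Value assignment is usually what helps most, since it
replaces UBound+1 separate worksheet writes per line with one.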
Thanks to everyone for their valuable insights.
--
Dave Peterson