Posted to microsoft.public.excel.programming
Jim Thomlinson
Subject: Opening CSV file with 69000+ rows in Excel using VBA

Knowing now that he is willing to toss the data out, you are correct. My
original assumption was that he would want to store the data, and if it can
be stored in one place that is usually best (instead of getting multiple
sheets involved). I then got myself onto a one-track (Access) solution,
similar to what I did for a previous project. I guess what I am trying to
say is you are absolutely correct. ADO will work great in this instance.

"Tom Ogilvy" wrote:

If you're going to use ADO, there is no need to involve Access.

--
Regards,
Tom Ogilvy
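
A minimal sketch of that ADO-only route, using the Jet text provider
pointed at the CSV's folder. The folder C:\Data\ and file big.csv are
hypothetical, and it assumes a reference to the Microsoft ActiveX Data
Objects 2.x Library:

Sub ImportCsvViaADO()
    Const MAX_ROWS As Long = 65536
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset
    Dim ws As Worksheet

    Set cn = New ADODB.Connection
    ' Data Source is the FOLDER; the file itself acts as the table name.
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\;" & _
            "Extended Properties=""text;HDR=Yes;FMT=Delimited"""

    Set rs = New ADODB.Recordset
    rs.Open "SELECT * FROM [big.csv]", cn, adOpenForwardOnly, adLockReadOnly

    ' CopyFromRecordset advances the cursor, so each call picks up where
    ' the previous one stopped -- one sheet-sized chunk at a time.
    Do Until rs.EOF
        Set ws = Worksheets.Add
        ws.Range("A1").CopyFromRecordset rs, MAX_ROWS
    Loop

    rs.Close
    cn.Close
End Sub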


"Philip" wrote in message
...
Hi,

How could I create that Access db on the fly at runtime, use it to load
and handle the data, then destroy it?

I tried using the Access 9 Library in VBA, and there is no 'dim oDB as New
Database' option...

Is that possible?
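
One rough sketch of that, using ADOX rather than the Access object
library (late bound, so no reference is needed; the path is hypothetical):

Sub ScratchAccessDb()
    Const DB_PATH As String = "C:\Temp\scratch.mdb"
    Dim cat As Object

    ' ADOX can create an .mdb at run time; late binding avoids needing
    ' the "Microsoft ADO Ext. for DDL and Security" reference.
    Set cat = CreateObject("ADOX.Catalog")
    cat.Create "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & DB_PATH
    Set cat = Nothing

    ' ... open an ADODB.Connection against DB_PATH, load and query ...

    Kill DB_PATH   ' destroy the scratch database when done
End Sub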

"Jim Thomlinson" wrote:

Any possibility to drop the CSV into MS Access and then query the data out
from there? If not, then you are left with reading the text file one line
at a time and using the Split function, then pasting the array generated by
Split into the sheet, incrementing the sheet as necessary... That is kinda
slow and ugly though... Access would be a much better option. You could
even hook a pivot table up to the Access database if you want. A pivot
coming out of Access is good for at least about 650,000 records with
reasonable performance.
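
A sketch of that line-by-line approach, assuming a comma delimiter and a
hypothetical file name; it rolls to a fresh sheet every 65536 rows:

Sub ImportCsvLineByLine()
    Const MAX_ROWS As Long = 65536
    Dim ff As Integer
    Dim lineText As String
    Dim fields As Variant
    Dim ws As Worksheet
    Dim r As Long

    ff = FreeFile
    Open "C:\Data\big.csv" For Input As #ff

    Set ws = Worksheets.Add
    r = 0
    Do While Not EOF(ff)
        Line Input #ff, lineText
        fields = Split(lineText, ",")
        r = r + 1
        If r > MAX_ROWS Then          ' sheet is full; start another
            Set ws = Worksheets.Add
            r = 1
        End If
        ' A 1-D array assigned to a one-row range fills it left to right.
        ws.Cells(r, 1).Resize(1, UBound(fields) + 1).Value = fields
    Loop

    Close #ff
End Sub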

HTH

"Philip" wrote:

Hi all,

I have to open csv files with possibly more than 65000 rows of data in
them, and all the rows beyond one worksheet's capacity have to be put on
the next sheet... when that is full I have to add another sheet, and so
on...

Due to the limitations of Excel, I can't even tell the text import
(QueryTables) method to start at row 65000, as the TextFileStartRow
parameter is supposed to be an Integer (DOH!) - who came up with that one?

So, what is the best option: open the files using the OLEDB Text Provider
in ADO and load from a recordset to read them in chunks of 65000-odd
records, or is there a better way?

Or should I use the text file import method and read it in in chunks of,
say, 30000 by setting the TextFileStartRow property to the current value +
30000 for each iteration until completed?
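
A rough sketch of that chunked loop, with a hypothetical path. There is no
"end row" property, so each pass is assumed to run until the sheet's
65536-row cap cuts it off, with the next pass starting where the last one
stopped; and if TextFileStartRow really is typed as Integer, the assignment
would overflow once the start row passes 32767, which is exactly the
limitation described above:

Sub ImportCsvInChunks()
    Const SHEET_CAP As Long = 65536
    Dim startRow As Long
    Dim rowsRead As Long
    Dim ws As Worksheet

    startRow = 1
    Application.DisplayAlerts = False   ' assumed to suppress the "file
                                        ' not loaded completely" prompt
    Do
        Set ws = Worksheets.Add
        With ws.QueryTables.Add( _
                Connection:="TEXT;C:\Data\big.csv", _
                Destination:=ws.Range("A1"))
            .TextFileStartRow = startRow
            .TextFileParseType = xlDelimited
            .TextFileCommaDelimiter = True
            .Refresh BackgroundQuery:=False
            rowsRead = .ResultRange.Rows.Count
            .Delete                     ' drop the query, keep the data
        End With
        startRow = startRow + rowsRead
    Loop While rowsRead = SHEET_CAP     ' a short read means we hit EOF
    Application.DisplayAlerts = True
End Sub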

thanks for any help or ideas or sympathy...

Philip