Import external data exceeds 65536
Hi Stacey:
"Stacey" wrote: I realize Excel has a row limitation of 64K here, Are there more elswhere? I am attempting to retrieve a result set (from MSQuery) which has more rows than Excel allows on one sheet Excel allows 65k+ rows in one column you can use the other columns Basically I'd like to import all rows from the result set into Excel, using multiple sheets. Does any one have any sample code for auto generating a new sheet based on resultSet data (rather than reading from a text file?) Well yes I do , using ADO and you could bring it in to the same sheet or whereever you want it. For Example: This puts the first 65k + 3 field Query into col a b c cnt = Rs.RecordCount '//get record count if you need them MsgBox "Count = " & cnt Dim irow As Long irow = 3 With Rs .MoveFirst Do Until irow = 65536 Worksheets("Sheet3").Range("A" & irow) = Rs("ProductID") Worksheets("Sheet3").Range("B" & irow) = Rs("ProductName") Worksheets("Sheet3").Range("C" & irow) = Rs("UnitPrice") Debug.Print Rs("ProductName") .MoveNext irow = irow + 1 Loop End With cnt2 = Rs.AbsolutePosition '// where we at in the recordset MsgBox "Position = " & cnt2 '/// here we move to d e f for the balance of the recordset '/// see the recordset pointer remembers it's position or you can '/// get it with the above code "Rs.AbsolutePosition" irow = 3 With Rs Do Until .EOF Worksheets("Sheet3").Range("D" & irow) = Rs("ProductID") Worksheets("Sheet3").Range("E" & irow) = Rs("ProductName") Worksheets("Sheet3").Range("F" & irow) = Rs("UnitPrice") .MoveNext irow = irow + 1 Loop End With That's one way, Stacey. If it happens to be an Access db post back and I'll post you some connection code that you can play around with. By the way I brought 88.000, 3 field records into columns a b c and d e f in 64 seconds on an old laptop 1.6 I think. Good Luck TK |
Import external data exceeds 65536
"TK" wrote ...
> This puts the first 65k rows of a 3-field query into col a b c
<snipped>
> By the way I brought 88,000 3-field records into columns a b c
> and d e f in 64 seconds on an old laptop 1.6 I think.

Here's an extension to this idea. Assuming the table has a key, you could sort on it, navigate to row 65535 (or whatever), set an appropriate Filter using the key values at rows 1 and 65535 so the recordset now contains only 65535 or fewer rows, use GetRows or CopyFromRecordset to read all (visible) rows to the worksheet, remove the Filter and repeat until EOF. I do not have the code/database to test (Products in my Northwind has only 77 rows <g>) but I would expect this to be faster than writing line-by-line if a suitable key were available.

Jamie.

--
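For anyone wanting to try the repeat-until-EOF idea, here is a rough sketch. Rather than the Filter bookkeeping described above, it leans on the optional MaxRows argument of Range.CopyFromRecordset to take one sheet-sized chunk per pass. The Sub name is illustrative and it assumes an already-open ADO recordset Rs; this is untested against a 65k+ row source:

```vba
'/ Sketch only: write an oversized recordset in sheet-sized chunks,
'/ one chunk per new sheet, repeating until EOF.
Sub WriteInChunks(Rs As Object)
    Const MAX_ROWS As Long = 65536
    Dim ws As Worksheet
    Do Until Rs.EOF
        Set ws = Worksheets.Add(After:=Worksheets(Worksheets.Count))
        '/ CopyFromRecordset advances the cursor, so each pass
        '/ resumes where the previous sheet left off.
        ws.Range("A1").CopyFromRecordset Rs, MAX_ROWS
    Loop
End Sub
```

This avoids per-cell writes entirely, which (per the timings later in this thread) is where most of the 64 seconds went.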
Import external data exceeds 65536
"Jamie Collins" wrote: "TK" wrote ... This puts the first 65k + 3 field Query into col a b c <<snipped By the way I brought 88.000, 3 field records into columns a b c and d e f in 64 seconds on an old laptop 1.6 I think. Here's an extension to this idea. Assuming the table has a key, you could sort on it, navigate to row 65535 (or whatever), set an appropriate Filter using the key values at rows 1 and 65535 so the recordset now contains only 65535 or less rows, use GetRows or CopyFromRecordset to read all (visible) rows to the worksheet, remove the Filter and repeat until EOF. I do not have the code/database to test (Products in my northwind has only 77 rows <g) but I would expect this to be faster than writing line-by-line if a suitable key was available. Jamie. -- Hi Jamie: I see post all the time referring to the 65k limit in excel being referred to incorrectly. It is, as you know a column limitation. I posted the procedure merely to prove that point. By no means do I advocate it, and I even posted the time it took. My advise €śfor what its worth, if anything€ť has always been to narrow the cursor at the db level. Also, from my experience I dont think the answerer is GetRows, I have timed CopyFromRecordset and GetRows and CopyFromRecordset was faster in the test and you dont have an array to work with if you want to read it back in. Good Luck TK |
Import external data exceeds 65536
Thank you all for the great responses. Your feedback is greatly appreciated and is definitely getting me on the right track.

In response to some of the open ends I left:

> You don't mention the DB being used (assuming that's where the data is from).

The database is SQL Server. MSQuery is just what Excel "seems" to use to write the SQL statement with. I'm open to others.

> One question might be what you intend to do with it once it's in Excel?
> When split over multiple sheets it's not going to be very usable.

Hey, I don't make the demands, I just attempt to fill them. Personally, how is 65K rows readable/usable?! I've used pivot tables where I can to reduce the number of rows retrieved, hoping that is more useful to the end user.

I'm not familiar with ADO but am seeing it could be very useful here and am giving it a try.

Thanks again!
Stacey
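Since Stacey mentions being new to ADO and the source is SQL Server, here is a minimal starting point using late binding. The server, database and table names are placeholders, not anything from this thread; substitute your own:

```vba
'/ Minimal ADO pull from SQL Server (sketch -- names are placeholders).
'/ Late binding avoids needing a VBA project reference to ADO.
Sub PullFromSqlServer()
    Dim Cn As Object, Rs As Object
    Set Cn = CreateObject("ADODB.Connection")
    Cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    Set Rs = CreateObject("ADODB.Recordset")
    Rs.Open "SELECT ProductID, ProductName, UnitPrice FROM Products", Cn

    '/ Fast bulk write; stops at the sheet's row limit.
    Worksheets("Sheet1").Range("A2").CopyFromRecordset Rs

    Rs.Close
    Cn.Close
End Sub
```

SQLOLEDB with Integrated Security=SSPI uses your Windows login; swap in "User ID=...;Password=...;" for SQL Server authentication.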
Import external data exceeds 65536
Jamie:
I thought you might find this interesting. (The db I used here is trash, just one I play around with; too many dups etc. to be of value or to share. One table, three fields; that's why I couldn't set the value at 65k+.)

I wrote a simple select query in Access:

    SELECT DISTINCTROW TOP 30500 Products.ProductID, Products.ProductName,
           Products.UnitPrice
    FROM Products;

Called the query from Excel with the Connection object and used CopyFromRecordset to write it into a sheet: 1.3 sec.

Using the same select statement and CopyFromRecordset in Excel to write to a sheet: .59 sec.

Using a loop:

    strSql = "SELECT Products.ProductID, Products.ProductName, " & _
             "Products.UnitPrice FROM Products;"
    With Rs
        .MoveFirst
        Do Until irow = 30502
            Worksheets("Sheet3").Range("A" & irow) = Rs("ProductID")
            Worksheets("Sheet3").Range("B" & irow) = Rs("ProductName")
            Worksheets("Sheet3").Range("C" & irow) = Rs("UnitPrice")
            .MoveNext
            irow = irow + 1
        Loop
    End With

14.4 sec.

So I guess my premise to do as much as you can at the db level doesn't mean much. But then, what the hell, this is just a hobby.

Have a happy Thanksgiving
TK
Import external data exceeds 65536
"TK" wrote ...
> I thought you might find this interesting. I wrote a simple select query
> in Access ... used CopyFromRecordset to write it into a sheet it took
> 1.3 sec ... using a loop ... 14.4 sec

I agree. Using a 60K-row by 100-column recordset, my results were:

    5.438 secs using CopyFromRecordset
    9.250 secs using Application.Transpose(rs.GetRows)

BTW, when I tried 60K rows by 255 columns it was 14 secs for CopyFromRecordset, whereas GetRows caused an error, 'Not enough storage is available to complete this operation', for which I assume the limitation lies with the Transpose() function.

> Have a happy thanksgiving

We don't celebrate it in my country (UK); we go mad at xmas instead <g>. But I'll enjoy the day, thanks, and hope you will too.

Jamie.

--
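For anyone wanting to reproduce the comparison, here is a sketch of the two approaches being timed. It assumes Rs is an open ADO recordset; sheet names are illustrative:

```vba
'/ 1) CopyFromRecordset: Excel pulls rows straight from the cursor.
Worksheets("Sheet1").Range("A1").CopyFromRecordset Rs

'/ 2) GetRows: returns a (field, row) Variant array, so it must be
'/ transposed into Excel's (row, column) orientation before writing.
Dim v As Variant
v = Application.Transpose(Rs.GetRows)
Worksheets("Sheet2").Range("A1").Resize(UBound(v, 1), UBound(v, 2)) = v
```

The transpose step is both the extra cost and, per Jamie's 255-column test, the likely source of the out-of-storage error on very wide recordsets.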
Import external data exceeds 65536
"TK" wrote ...
> I see post all the time referring to the 65k limit in excel being
> referred to incorrectly. It is, as you know a column limitation.
> I posted the procedure merely to prove that point.

I don't understand. The column limit (Jet) is 255 and your example uses 3 columns, therefore the recordset would need to be 16+ million rows to hit the *column* limit. I assume you know about needing to transpose the GetRows array to match Excel's (row, column) format, so what do you mean?

Thanks again,
Jamie.

--
Import external data exceeds 65536
Jamie
That is my point.

Stacey wrote:
> ... Excel has a row limitation of 64K here, and I've seen solutions for
> reading a text file having 65536 rows into Excel via a macro and adding
> a new sheet after the 65536th row. But I am attempting to retrieve a
> result set (from MSQuery) which has more rows than Excel allows on one
> sheet.

"TK" wrote:
> I see posts all the time referring to the 65k limit in Excel incorrectly.
> It is, as you know, a per-column limitation. I posted the procedure
> merely to prove that point.

I meant 65k rows per column, times 256 columns (I thought it was 256, but who's counting), or 16,777,216 cells.

Also you might find this interesting: in playing around with an 88k recordset, CopyFromRecordset can handle the output. It does not fail; it merely stops at 65,536 if you do not redirect the output, as in the example below.

    '/ Starts at A10 but does not stop where the range ends
    Worksheets("Sheet3").Range("A10:A1000").CopyFromRecordset Rs

    '/ If more than 65,536 records you must add a second line.
    '/ Starts at D10 but does not stop where the range ends
    Worksheets("Sheet3").Range("D10:F10000").CopyFromRecordset Rs

Good Luck
TK
Import external data exceeds 65536
Jamie
"TK" wrote: Jamie Also you might find this interesting, in playing around with a 88k recordset CopyFRomRecordset can handle the output. It does not fail it merely stops at 65,536 if you do not redirect the output as in the example below. '/Starts at A10 but does not stop at A10000 Worksheets("Sheet3").Range("A10:A1000").CopyFromRe cordset Rs '/If more than 65,536 records you must add a second line '/Starts at D10 but does not stop F??? Worksheets("Sheet3").Range("D10:f10000").CopyFromR ecordset Rs The above is not false it is just not that accurate in further test the following also will bring 88.000, 3 field records into cols A B C and the balance into D E F Worksheets("Sheet3").Range("A:A,D:D").CopyFromReco rdset Rs Good Luck TK |
Import external data exceeds 65536
"TK" wrote ...
> you might find this interesting, in playing around with a 88k recordset
> CopyFromRecordset can handle the output. It does not fail it merely
> stops at 65,536

Ah, this is something I hadn't appreciated, i.e. that the current record stops at the maximum row limit rather than proceeding to EOF. Thanks for that.

> I thought it was 256 columns but whose counting

Excel has a 256-column limit. The column limit in a Jet *table* is 255 columns, and AFAIK you cannot construct a *recordset* with more than 255 columns either when the source is Jet.

Jamie.

--
ExcelBanter.com