#1
Posted to microsoft.public.excel.programming
Hi there
I'm making many round-trips to SQL Server from Excel using a VBA module and an ADO connection. Since I'm connecting to only one SQL Server database, I keep the same connection open throughout the procedure, calling "conn.Close" and "Set conn = Nothing" only at the very end. Then, using this connection, I iterate through each record in the database, creating a new recordset each time, i.e. "For i = 1 To 117394 | Set rs = conn.Execute( , ) | ... | Next i", etc.

I know this must sound inefficient and slow, which it is, but I find it's even slower to pull a single recordset containing all the records up front and then loop through it using "Do While Not rs.EOF | ... | rs.MoveNext | Loop": the database is very large (about 9 GB), there are many complex joins between the tables (at least 40), and the requisite VBA code is unwieldy to say the least.

The problem I encounter is this: for the first few thousand records the procedure is zippy, but thereafter performance tapers off, becoming progressively slower and slower; CPU usage rises to 100% and Excel ends up absorbing all the available RAM and paging-file memory. I presume I am not releasing objects from memory, though within the loop that creates a recordset I also call "rs.Close" and "Set rs = Nothing".

Can you shed any light on where the memory-intensive part of the operation is? Or perhaps I'm way off track.

Best regards
Loane
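For reference, the per-record pattern described above looks roughly like the following. This is only a sketch (the server, database, table, and column names are invented; it requires a project reference to Microsoft ActiveX Data Objects). The point to check is that every object created inside the loop really is released on each pass:

```vb
' Hypothetical sketch only: "MyServer", "MyDatabase", "Staff",
' "StaffID" and "PerfScore" are invented names for illustration.
Dim conn As ADODB.Connection
Dim rs As ADODB.Recordset
Dim i As Long

Set conn = New ADODB.Connection
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
          "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

For i = 1 To 117394
    ' conn.Execute returns a forward-only, read-only recordset,
    ' the lightest cursor ADO offers
    Set rs = conn.Execute("SELECT PerfScore FROM Staff WHERE StaffID = " & i)
    ' ... process rs here ...
    rs.Close
    Set rs = Nothing   ' release each pass, not just at the end
Next i

conn.Close
Set conn = Nothing
```

Even with the cleanup in place, 117,394 separate round-trips means 117,394 separate query compilations and network exchanges, which is usually where this pattern hurts.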
#2
Posted to microsoft.public.excel.programming
Loane,

What is the task you're performing? It's not clear from your post what the endpoint of all your loops is. There may be a pure SQL approach (a batch update?) which is faster. You're not compiling results from your queries into a string variable or something like that? That kind of activity can produce the performance pattern you're seeing...

Tim.

"Loane Sharp" wrote in message ...
<snip: original post quoted in full>
#3
Posted to microsoft.public.excel.programming
Hi Tim

I have data on individual staff members working for a particular company, and I'm trying to create performance distributions for them. The (individual and collective) performance calculations are not very straightforward, since they're drawn from bits and pieces of information in several databases. I haven't got a clue how I would do this using SQL statements, for example, and using Excel's range of built-in functions (the statistical functions, for example) is simply easier.

I've tried to be as parsimonious as possible, for instance creating recordsets with only the minimum data so as to use the ADO connection less, but this hasn't helped (and it doesn't really seem to affect performance materially anyway).

Thanks for your suggestions; I'll check out batch updates and let you know how things turn out. (Incidentally, I'm not compiling query results into a string variable. Far worse: I'm using CopyFromRecordset to copy the data into a worksheet!)

Best regards
Loane
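For what it's worth, the CopyFromRecordset route mentioned above can itself be done in one bulk call rather than once per record, which combines well with Excel's built-in statistical functions. A sketch under invented names ("Data" sheet, "Performance" table; "conn" assumed already open):

```vb
' Hypothetical sketch: pull the data once, paste it in one call,
' then let worksheet functions do the statistics.
Dim rs As ADODB.Recordset
Set rs = conn.Execute("SELECT StaffID, Score FROM Performance")

With Worksheets("Data")
    .Cells.ClearContents
    .Range("A2").CopyFromRecordset rs   ' one bulk paste, no VBA loop
    ' Excel's built-in functions can now run over the pasted range:
    .Range("D1").Formula = "=AVERAGE(B2:B117395)"
    .Range("D2").Formula = "=STDEV(B2:B117395)"
End With

rs.Close
Set rs = Nothing
```

One bulk paste keeps the worksheet interaction out of the loop entirely, which matters because cell-by-cell writes from VBA are far slower than a single CopyFromRecordset call.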
#4
Posted to microsoft.public.excel.programming
Loane Sharp wrote:
> they're drawn from bits and pieces of information in several databases.

Are any of these 'databases' open Excel workbooks? There is a known memory leak when ADO queries a workbook that is open in Excel:

http://support.microsoft.com/default...;en-us;Q319998

Jamie.