#1
Posted to microsoft.public.excel.programming, microsoft.public.excel.worksheetfunctions
Optimizing performance of functions that access a database
The problem must be very familiar to those who write custom Excel functions that retrieve data from a database server. Even when a single call to such a function is lightning fast (i.e., recalculation of a single cell goes unnoticed by a human user), the fun begins when Excel tries to recalculate hundreds of such cells in a practically random order...

One obvious solution is to program some sort of cache; at the very least, it would help workbooks that need to display multiple fields of each record. (For example, when a workbook needs to show 10 fields for each of 100 records, such a cache could generate only 100 database queries instead of the worst case of 10x100 = 1000.) I think such workbooks are pretty typical (at least in our workplace), and such performance improvements would be much appreciated by users... There are probably other, more sophisticated optimization tricks out there...

However, my question is not how to program such a cache or other tricks. My question is: does anyone know of any VBA library/component that provides something like the above-mentioned caching (and perhaps other smart optimizations) out of the box? In other words, before building it ourselves, we would like to consider buying it.

Thank you,
Yarik.
#2
Posted to microsoft.public.excel.programming, microsoft.public.excel.worksheetfunctions
Optimizing performance of functions that access a database
Hi Yarik,
I use VBA Collections to cache intermediate results. I compose a string key from all identifying elements and just add the items to the collection. Each retrieval first looks in the collection to check whether the record is already there; if not, it accesses the external data and puts it in the collection. Of course, this is all encapsulated in the function, so in your spreadsheet you don't have to bother.

Access is lightning fast, and housekeeping of the collection is much easier than building tables yourself. You don't have to keep different tables for different types of data (as long as there is an identifier in the key), and if you're afraid of memory problems you just empty the collection every now and then; it will rebuild itself automatically when required.

I've read that Directories can be even faster, but I haven't had time yet to try. It probably depends on the nature of your keys and the pattern of access. If you try, please let us know the results.

--
Kind regards,
Niek Otten

"Yarik" wrote in message oups.com...
SNIP
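The Collection-based cache Niek describes might look roughly like this. This is only a sketch: FetchRecordFromDb is a hypothetical helper that runs a single query and returns all fields of one record as a Scripting.Dictionary (field name -> value), so 100 records x 10 fields cost 100 queries, not 1000.

```vba
' Sketch of a record-level cache using a VBA Collection.
' FetchRecordFromDb is a hypothetical placeholder for the actual query.
Private RecordCache As New Collection

Public Function DbField(ByVal RecordId As String, ByVal FieldName As String) As Variant
    Dim rec As Object                          ' Scripting.Dictionary: field -> value
    On Error Resume Next
    Set rec = RecordCache(RecordId)            ' look in the cache first
    On Error GoTo 0
    If rec Is Nothing Then
        Set rec = FetchRecordFromDb(RecordId)  ' one round trip for the whole record
        RecordCache.Add rec, RecordId          ' key = identifying string
    End If
    DbField = rec(FieldName)
End Function
```

Note the On Error Resume Next dance: a Collection has no way to ask whether a key exists, so a failed lookup is the cache-miss signal.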
#3
Posted to microsoft.public.excel.programming, microsoft.public.excel.worksheetfunctions
Optimizing performance of functions that access a database
I think Niek means Dictionaries from the Scripting Runtime library, not Directories. In my experience, Dictionaries should be well suited for this task.

Collections can maintain ordered lists by inserting items at a specified position while Dictionaries cannot, but this should be no problem. Dictionaries are more versatile and faster than Collections.

Advantages:
- check if a key exists (a Collection must be tested with On Error Resume Next)
- compare keys case-sensitively (keys are not necessarily strings, so you can use Long numbers)
- change keys
- retrieve the arrays of the keys and items (a Collection's items must be retrieved by enumerating them)

--
keepITcool | www.XLsupport.com | keepITcool chello nl | amsterdam

Niek Otten wrote:
Hi Yarik,
SNIP
I've read that Directories can be even faster, but haven't had time yet to try. It probably depends on the nature of your keys and the pattern of access. If you try, please let us know the results.
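The same cache written against a Scripting.Dictionary shows the first two advantages listed above: .Exists replaces the On Error test, and CompareMode gives case-sensitive keys. FetchRecordFromDb remains a hypothetical helper returning one record's fields as a Dictionary.

```vba
' Sketch of the record cache using Scripting.Dictionary (late bound,
' so no reference to Microsoft Scripting Runtime is required).
Private RecordCache As Object

Public Function DbField(ByVal RecordId As String, ByVal FieldName As String) As Variant
    If RecordCache Is Nothing Then
        Set RecordCache = CreateObject("Scripting.Dictionary")
        RecordCache.CompareMode = 0            ' vbBinaryCompare: case-sensitive keys
    End If
    If Not RecordCache.Exists(RecordId) Then   ' no On Error needed
        RecordCache.Add RecordId, FetchRecordFromDb(RecordId)
    End If
    DbField = RecordCache(RecordId)(FieldName)
End Function
```

Note that Dictionary.Add takes (Key, Item) while Collection.Add takes (Item, Key); mixing up the argument order is a classic source of bugs when converting between the two.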
#4
Posted to microsoft.public.excel.programming, microsoft.public.excel.worksheetfunctions
Optimizing performance of functions that access a database
"keepITcool" wrote in message .com...

> I think Niek means Dictionaries from the Scripting Runtime library, not Directories.

Absolutely right!

> Collections can maintain ordered lists by inserting items at a specified position while Dictionaries cannot, but this should be no problem.

If you use keys rather than pointers, you no longer need to care where the item is located.

> Dictionaries are more versatile and faster than Collections. Advantages:
> - check if a key exists (a Collection must be tested with On Error Resume Next)

Indeed.

> - compare keys case-sensitively (keys are not necessarily strings, so you can use Long numbers)
> - change keys
> - retrieve the arrays of the keys and items (a Collection's items must be retrieved by enumerating them)

I'll give it a try!

--
Kind regards,
Niek Otten
#5
Posted to microsoft.public.excel.programming, microsoft.public.excel.worksheetfunctions
Optimizing performance of functions that access a database
Thank you, Niek.

However, my question was not how to implement caching myself. I was looking for a component that would provide such caching out of the box. I thought the problem was so common that someone in this community might know of some sort of ADO-compatible component implementing such caching. For example, it could provide exactly the same interface as ADODB but cache whole recordsets for a configurable amount of time. Or something like that. This would let us make our "legacy code" work faster with almost no rework (let alone new programming).

Best regards,
Yarik.
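Short of a ready-made component, a thin wrapper over ADODB gets close to what Yarik describes: same call shape as Connection.Execute, but identical SQL within a session is served from a cached, disconnected Recordset. This is a sketch only; CachedExecute is an illustrative name, not part of any real library.

```vba
' Sketch of a caching wrapper around ADODB: the first call for a given
' SQL string hits the server; repeats are served from memory.
Private RsCache As Object   ' Scripting.Dictionary: SQL text -> disconnected Recordset

Public Function CachedExecute(ByVal cn As Object, ByVal sql As String) As Object
    If RsCache Is Nothing Then Set RsCache = CreateObject("Scripting.Dictionary")
    If Not RsCache.Exists(sql) Then
        Dim rs As Object
        Set rs = CreateObject("ADODB.Recordset")
        rs.CursorLocation = 3                 ' adUseClient
        rs.Open sql, cn, 3, 1                 ' adOpenStatic, adLockReadOnly
        Set rs.ActiveConnection = Nothing     ' disconnect: data now lives in memory
        RsCache.Add sql, rs
    End If
    Set CachedExecute = RsCache(sql)
End Function
```

Time-based expiry, as suggested above, could be added by storing a timestamp alongside each recordset and re-querying when it is older than the configured limit.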