#1
Posted to microsoft.public.excel.programming
Hello,
My boss asked me to create some macros in VBA to process a list of records in a spreadsheet. That part is easy: I go out to websites, get data based on the number ID in the spreadsheet, bring the data back, and then update the row in the spreadsheet for each number ID. I got that working well.

Now he is giving me huge files to process with the same routine: some CSVs, some DBFs, some Word docs that will need to be parsed, and a couple of spreadsheets. With all the different data sources, I thought I should load them into SQL Server and create one database, with each table representing one of the data sources I have been given. I have done that. But the tables are big: one has 300,000 records, another 200,000, and so on, and there are 8 tables in all.

Now back to the Excel macro he wanted me to create in the first place. I can convert the macro to get the number ID from SQL Server, process the information, and even store the results back to SQL Server. However, I am looking for best practices:

Is using Excel VBA the right way to process hundreds of thousands of records in SQL Server? If so, what is the best way to do it? Should I write VBA code to bring all the records from one table into Excel and then process them, now that Excel can go over a million rows? Or write VBA code to process blocks of number IDs? And should I still be using Excel as the interface at all, since I am now pulling number IDs from a SQL Server database and not a spreadsheet?

Thanks for helping me clarify which way to go with this project.
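For the "process blocks of number IDs" option, one approach is to have VBA read the IDs straight from SQL Server through ADO and write each result back with an UPDATE, so Excel never has to hold a 300,000-row table in a worksheet. The sketch below is only illustrative: the connection string, the table and column names (dbo.Records, NumberID, ResultData, Processed), and the GetDataFromWebsite helper are placeholders for whatever the real database and web-lookup routine look like.

' Illustrative only -- server, database, table, and column names are assumptions
Sub ProcessIdsInBlocks()
    Dim cn As Object, rs As Object
    Dim sql As String, id As Variant, result As String

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    Set rs = CreateObject("ADODB.Recordset")
    rs.CursorLocation = 2          ' adUseServer: rows stay on the server
    rs.CacheSize = 500             ' fetch from the server in blocks of 500 rows
    rs.Open "SELECT NumberID FROM dbo.Records WHERE Processed = 0", _
            cn, 1, 3               ' adOpenKeyset, adLockOptimistic

    Do Until rs.EOF
        id = rs.Fields("NumberID").Value
        result = GetDataFromWebsite(id)   ' stand-in for the existing web lookup

        ' Write the result straight back to SQL Server instead of a worksheet
        sql = "UPDATE dbo.Records SET ResultData = '" & _
              Replace(result, "'", "''") & "', Processed = 1 " & _
              "WHERE NumberID = " & id
        cn.Execute sql
        rs.MoveNext
    Loop

    rs.Close
    cn.Close
End Sub

' Hypothetical stand-in for the existing web-lookup macro
Function GetDataFromWebsite(ByVal id As Variant) As String
    GetDataFromWebsite = "data for " & id
End Function

Keeping the recordset server-side means the workbook is only the trigger for the job; a worksheet is only needed if someone actually wants to look at a slice of the data.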
#2
Posted to microsoft.public.excel.programming
On Nov 15, 2:08 am, Webtechie wrote:
<original post snipped>

Is there a reason you can't do what you want to with a stored procedure?
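Following up on the stored-procedure suggestion: any part of the work that is purely set-based (joins, lookups, and updates across the eight tables) could live in SQL Server itself, with Excel doing nothing more than kicking it off. The web lookups would still have to happen in VBA, but the bulk processing would never need to touch a worksheet. A minimal sketch, assuming a procedure named dbo.usp_ProcessRecords exists and the same placeholder connection string as above:

Sub RunServerSideProcessing()
    Dim cn As Object, cmd As Object

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    ' Run the set-based work inside SQL Server via a stored procedure
    Set cmd = CreateObject("ADODB.Command")
    With cmd
        Set .ActiveConnection = cn
        .CommandType = 4            ' adCmdStoredProc
        .CommandText = "dbo.usp_ProcessRecords"
        .CommandTimeout = 0         ' large batches may run for a while
        .Execute
    End With

    cn.Close
End Sub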