Posted to microsoft.public.excel.programming
I'm working with a large data set in Excel, over 38,000 rows. I'm trying to combine two such sets and then do some pivots, but because the combined table is too large, I need to eliminate some unnecessary data. My approach so far has been to pull in the first set, run through each row to find the rows I can eliminate, and then delete each one. In this case, I'm trying to delete any row with "2007" as the year.

Everything seems to be working, but once the code finds the 2007 rows, it takes a very long time to run through the data. I've already turned off screen updating and set calculation mode to manual, but it's still very slow. When I step through the code, I don't detect any pauses or problems deleting the rows; it seems to run through the process very quickly. Is there any reason why this should take so long? Is there a better approach?

Thanks,
Todd
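For what it's worth, a common cause of this slowdown is that every individual row deletion forces Excel to shift all of the rows below it up, so the loop pays that cost once for each deleted row even with screen updating and calculation turned off. One way around it is to collect the matching rows into a single Range with Union and delete them all in one operation at the end. Here is a minimal sketch of that approach, assuming the year sits in column A of a sheet named "Data" and is stored as a number; the sheet name and column are placeholders, so adjust them to the actual layout:

Sub DeleteYear2007Rows()
    Dim ws As Worksheet
    Dim lastRow As Long, r As Long
    Dim toDelete As Range

    Set ws = Worksheets("Data")    ' hypothetical sheet name
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow           ' row 1 assumed to be headers
        ' Assumes the year is a numeric value in column A
        If ws.Cells(r, "A").Value = 2007 Then
            If toDelete Is Nothing Then
                Set toDelete = ws.Rows(r)
            Else
                Set toDelete = Union(toDelete, ws.Rows(r))
            End If
        End If
    Next r

    ' One Delete call instead of one per matching row
    If Not toDelete Is Nothing Then toDelete.Delete
End Sub

If very many rows match, building a Union with thousands of areas can itself get slow; in that case, applying an AutoFilter to the year column and deleting the visible rows in one go, or sorting by year first so the 2007 rows form one contiguous block, is usually faster still.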