Posted to microsoft.public.excel.programming
Hi, I've set up my spreadsheet so that at the click of a button it will check all of the entries for duplicates. Unfortunately there is more and more data on the sheets each week, and this slows down the duplicate checker. I assume this is because it is somehow filling up the computer's memory, and the easiest way to sort it would be to dump the unneeded data it might be storing after each section of the search. My questions are: what is the best way to do this? Is this the best way to sort it out? And will it get rid of the variables I need to keep (e.g. If sheetScan = 3 Then Sheets("Progress").Activate, which I use to scroll through the sheets)? Any help will be greatly appreciated. TIA Jonny
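For what it's worth, a minimal VBA sketch of the idea described above: releasing only the bulk data after each section of the search, while leaving a loop counter like sheetScan intact. This is not the poster's actual checker; the variable names (dataRange, buffer) and the per-sheet loop are assumptions for illustration.

```vba
Dim sheetScan As Integer           ' kept for the whole run; never cleared

Sub CheckDuplicates()
    Dim dataRange As Range         ' object reference to one section's cells
    Dim buffer As Variant          ' array holding one section's values

    For sheetScan = 1 To Sheets.Count
        Set dataRange = Sheets(sheetScan).UsedRange
        buffer = dataRange.Value   ' read the section into memory
        ' ... duplicate-checking logic for this section would go here ...

        ' Dump only the unneeded bulk data; sheetScan is unaffected.
        Erase buffer               ' frees the array's storage
        Set dataRange = Nothing    ' drops the object reference
    Next sheetScan
End Sub
```

Clearing simple numeric variables like sheetScan is unnecessary (they occupy a fixed few bytes); it is the large arrays and object references that are worth releasing after each section.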