Posted to microsoft.public.excel.programming
Hi, I have very large raw data files; a monthly file can run to 20 or 30 MB. The files contain many columns and records that I don't need, so I have written a macro that deletes the unwanted columns and rows. When I run the macro, I notice it gets very slow as the file grows, because it checks the records line by line to decide whether each one is wanted. For example, if a record contains "ABC", the macro deletes that whole record. Is there a way to program the macro to sort the data file, find the first and last occurrence of "ABC", and then delete that whole range of records in one operation? Or is there any other way to make the process faster? Please advise. Thanks a lot.

~~ Message posted from http://www.ExcelTip.com/ ~~ View and post usenet messages directly from http://www.ExcelForum.com/
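For context, here is a minimal sketch of the two approaches in VBA. The sheet name ("RawData") and the assumption that the key values sit in column A with a header row are hypothetical, not from the post. Note that sorting only groups the matching records into one contiguous block if they sort together (e.g. the key values all begin with "ABC"); for a "contains ABC" test, an AutoFilter pass is the safer way to delete all matches in a single operation.

```vba
' Approach 1: row-by-row deletion (slow on large files), as described in the
' question. Looping bottom-up avoids skipping rows when a deletion shifts
' everything below it upward.
Sub DeleteRowByRow()
    Dim ws As Worksheet, r As Long
    Set ws = Worksheets("RawData")                 ' hypothetical sheet name
    Application.ScreenUpdating = False
    For r = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row To 2 Step -1
        If InStr(1, ws.Cells(r, "A").Value, "ABC", vbTextCompare) > 0 Then
            ws.Rows(r).Delete
        End If
    Next r
    Application.ScreenUpdating = True
End Sub

' Approach 2: filter for "contains ABC" and delete all visible matching rows
' at once. One bulk delete instead of thousands of single-row deletes is
' typically far faster.
Sub DeleteWithAutoFilter()
    Dim ws As Worksheet
    Set ws = Worksheets("RawData")
    With ws.UsedRange
        .AutoFilter Field:=1, Criteria1:="*ABC*"  ' wildcard = "contains ABC"
        On Error Resume Next                      ' no matches -> no visible cells
        .Offset(1).Resize(.Rows.Count - 1) _
            .SpecialCells(xlCellTypeVisible).EntireRow.Delete
        On Error GoTo 0
    End With
    ws.AutoFilterMode = False                     ' clear the filter
End Sub
```

If the matching records do sort contiguously, the sort-based idea from the question also works: sort the used range on the key column, use `Range.Find` with `SearchDirection:=xlNext` and `xlPrevious` to locate the first and last match, then delete `Rows(firstRow & ":" & lastRow)` in one call.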
Similar Threads (Thread — Forum):
- Delete records when certain records have duplicate column data — New Users to Excel
- delete records with unique value — Excel Discussion (Misc queries)
- Sorting Records — Excel Discussion (Misc queries)
- Delete records using excel Macro — Excel Programming
- Delete records using excel Macro — Excel Programming