#1
Posted to microsoft.public.excel.programming
Hello,
Excel 2000. I have created a user report generator that gets data from a large (42,000 rows and 208 columns) closed Excel file. I've optimized my code, and creating the report takes 3 seconds AFTER I get the data. The problem is that getting the data takes 30 seconds. I want to get the specific data and put it into an array in my code; then it is very fast to manipulate.

My code opens the file, sorts it (primarily by Customer Code), and searches through a column that contains Customer Codes. Once it finds the first occurrence of the customer code, it grabs data from 32 columns (not all together but scattered) and puts it into an array, then loops until the last occurrence of the customer code is found. Opening the file takes 15 seconds and gathering the data takes 15 seconds, resulting in 30 seconds. If it will help, we can create a file that captures the rows that each company's data is in, so we can get the data more efficiently.

I've tried using ADO with limited success - it doesn't save time and doesn't collect all the data in columns that have varying data types. I've never used ADO, and there might be better ways to do it with ADO. I've also tried creating links to the database file and putting the data into an array, but again this doesn't save time.

Any suggestions will be appreciated. If you are really into it, I pasted my code below for your perusal and critical comments. Thanx !!
dave

Below is the code now in use:

Option Base 1

Sub Getdata()
    Application.DisplayAlerts = False
    Application.ScreenUpdating = False
    On Error GoTo ErrorHandler
    Dim i As Long
    Dim starttime
    Dim endtime
    Dim countloop
    ThisWorkbook.Activate

    'Define which columns (in required order) to pull
    '**If adding/deleting columns MUST change NumberOfItems
    'BEST TO ADD AT END - that way it won't affect other subs that look for a specific location in myArray
    MyGet = Array(2, 3, 21, 128, 155, 149, 150, 151, 152, 64, 15, 11, 12, 14, _
                  148, 13, 156, 1, 5, 6, 7, 8, 16, 19, 142, 166, 170, 169, 171, 172, 173, 174)

    'IMPORTANT: Set number of items that above array contains - used throughout this macro
    NumberOfItems = 32
    ReDim myArray(20000, NumberOfItems) As Variant

    'Set i to 1 to begin copy at first row, which is the column heading
    i = 1

    'Open database file MyFileName as selected in opening macro
    Workbooks.Open Filename:=MyFileName
    Capturefilename = ActiveWorkbook.Name

    '++++++++++++++++++++++++++++++++++++++++++++++++++++++
    'VERY IMPORTANT: SORTS MUST BE IN CORRECT ORDER
    'Logic is tied to company & book changes as well as others
    '++++++++++++++++++++++++++++++++++++++++++++++++++++++
    'First sort: FirmAcct, Order1, Tenor
    Range("A2").Select
    Selection.Sort Key1:=Range("S2"), Order1:=xlAscending, Key2:=Range("EL2"), _
        Order2:=xlAscending, Key3:=Range("FM2"), Order3:=xlAscending, Header:=xlGuess, _
        OrderCustom:=1, MatchCase:=False, Orientation:=xlTopToBottom

    'Second sort: Company, Book
    '(Sort Company in descending order to get "AVM" (which is by far the largest)
    ' to the bottom - makes looping more efficient)
    Range("A2").Select
    Selection.Sort Key1:=Range("B2"), Order1:=xlDescending, Key2:=Range("C2"), _
        Order2:=xlAscending, Header:=xlGuess, _
        OrderCustom:=1, MatchCase:=False, Orientation:=xlTopToBottom

    'Put titles in first row of array
    For a = 1 To NumberOfItems
        myArray(1, a) = Cells(1, MyGet(a))
    Next a

    'Start at location "B2"
    Cells(2, 2).Select
    Set curcell = ActiveCell

    'Cycle thru all cells in column "B" until a blank cell is encountered ("")
    'First cell with "" (no data or blanks - absolutely empty) will cause loop to stop
    Do While curcell.Value <> ""
        If curcell.Value = MyCompany Then
            Do While curcell.Value = MyCompany
                i = i + 1
                'Put selected columns in the current row into array
                For j = 1 To NumberOfItems
                    myArray(i, j) = Cells(curcell.Row, MyGet(j))
                Next j
                'Increment active cell to next cell below
                Set curcell = curcell.Offset(rowOffset:=1, columnOffset:=0)
            Loop
            'Once company changes, no need to continue - data is sorted by company
            Exit Do
        Else
            GoTo NextItem
        End If
NextItem:
        'Increment active cell to next cell below
        Set curcell = curcell.Offset(rowOffset:=1, columnOffset:=0)
        curcell.Select
    Loop

    'Counts how many items put in array, less the first row (titles)
    TotalHits = i - 1

    'Close database file
    Windows(Capturefilename).Close

    'Activate this workbook
    ThisWorkbook.Activate
    GoTo Skip

'Gives error message instead of VB error if file doesn't exist
'and stops macro execution
ErrorHandler:
    MsgBox ("Error occurred while trying to get database file" & Chr(13) & _
        "Ensure file " & MyFileName & " exists")
    End

Skip:
    If TotalHits < 1 Then
        Range("E2").Select
        MsgBox ("Did not find any data for selected company!" & Chr(13) & _
            " - Please ensure company code is entered correctly -")
        End
    End If
End Sub
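A sketch of one change worth trying against the loop above (untested against this workbook; it assumes MyCompany, MyGet, NumberOfItems, i and myArray exist exactly as in the posted code): read the whole block into a Variant array with a single assignment, then do all the looping in memory. Touching Cells once per value is usually the dominant cost in a loop like this.

    'Sketch: one bulk read, then all looping happens in memory
    Dim data As Variant
    Dim r As Long, j As Long

    'Single assignment pulls the entire block into a 1-based 2-D array
    data = ActiveSheet.Range("A1").Resize(42000, 208).Value

    For r = 2 To UBound(data, 1)
        If data(r, 2) = MyCompany Then          'column B holds the company code
            i = i + 1
            For j = 1 To NumberOfItems
                myArray(i, j) = data(r, MyGet(j))
            Next j
        ElseIf i > 1 Then
            Exit For                            'sorted data: we are past the company's block
        End If
    Next r

This doesn't shorten the 15 seconds spent opening the file, but it should cut most of the 15 seconds spent gathering the data.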
#2
If you are willing to break your data up into separate files, then those wouldn't take nearly as long to open. I might suggest putting the data in .csv files and using low-level file I/O to open them and parse out the data.

http://support.microsoft.com/support...eio/fileio.asp
File Access with Visual Basic® for Applications

or, if you want to preprocess your data, you might want to look at this:

http://support.microsoft.com/default...42&Product=vb6
HOWTO: Write Data to a File Using WriteFile API

--
Regards,
Tom Ogilvy

"Dave B" <Not.Available wrote in message ...
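To make the low-level file I/O suggestion concrete, here is a minimal sketch (the file path, the literal company code, and the column position are made up for illustration): open the .csv with Open/Line Input, split each line on commas, and keep only rows whose company field matches.

    Sub ReadCsvSketch()
        Dim fNum As Integer, lineText As String, fields As Variant
        Dim hits As Long

        fNum = FreeFile
        Open "C:\data\customers.csv" For Input As #fNum   'hypothetical path
        Do While Not EOF(fNum)
            Line Input #fNum, lineText
            fields = Split(lineText, ",")                 'naive split: ignores quoted commas
            If fields(1) = "ACME" Then                    'field 2 assumed to hold the company code
                hits = hits + 1
                'copy the wanted fields into your array here
            End If
        Loop
        Close #fNum
    End Sub

Note the naive Split will mis-parse fields that contain embedded commas inside quotes; real data may need a more careful parser.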
#3
Thanks Tom.
I experimented with creating separate files per customer, and it vastly improves the speed - 35 seconds down to 6 seconds or less. Since most customers have fewer than 2,500 rows of data, this seems like the best solution.

This may not be an option, since I am dependent on other developers re-writing their code to create separate files instead of one large file. There may also be other "reasons" why we need to keep the "big" file - other programs already link to it, etc.

My task is to create an Excel file that the users will use to get their customer data. If it takes too long, the users don't like it. But, for what it has to do, I think it runs pretty well at this point. My boss wants me to call Microsoft and pay for someone to suggest other ways to do this that may be faster, or to verify that it is as fast as it can be given the constraints.

I will look into the links you supplied for more ideas - thanks very much for your help and advice.

dave

"Tom Ogilvy" wrote in message ...
#4
Maybe put the data into Access and use ADO. It might be faster. You can transfer an ADO recordset to an array in one step; I believe it will be transposed from what you would expect.

Regards,
Tom Ogilvy

"Dave B" <Not.Available wrote in message ...
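The one-step transfer Tom mentions is presumably Recordset.GetRows. A hedged sketch (the connection string, table name, and company code are illustrative assumptions, not from the thread) - note the returned array is 0-based and transposed, with the field as the first dimension and the record as the second:

    Sub AdoGetRowsSketch()
        Dim cn As Object, rs As Object, data As Variant

        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\customers.mdb"   'hypothetical path
        Set rs = cn.Execute("SELECT * FROM Trades WHERE Company = 'ACME'")             'hypothetical table/code

        If Not rs.EOF Then
            data = rs.GetRows        'data(field, record) - transposed, 0-based
        End If
        rs.Close
        cn.Close
    End Sub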
#5
Tom,
I would have created the report generator in Access, but I was directed NOT to use Access. In fact, when I first started this project I loaded Access onto my computer and was working away when the person heading this project came by and almost threw up. They were adamant that I not introduce software new to what they are used to. So, I must do it all in Excel. I can't even use a report generator, although by now I've just about written one in VB to do what I've been requested to do. I must say that I've learned a lot, so it was a good experience.

I'm now mildly stressed out because I can't get it any faster with the constraints I've been given. I guess they'll be satisfied once I get a Microsoft expert to say it's as fast as it can be given the current environment.

Thanks again for your input - when I first started this job a year ago you pointed me towards the OnTime function in Excel (I needed DDE links to update while a macro was running). That made me a hero because it paved the way to convert over 100 LOTUS files to Excel.

dave

"Tom Ogilvy" wrote in message ...
#6
ADO should be able to read an access mdb file without having access on the
machine. You can also link to is using a query table from excel or from a pivottable (neither require code, but can be coded to change what records are returned). I haven't done much linking to external data using these methods so I can't say what the performance is, but what I have done has been very fast. Of course, to work with the data outside Excel would require access, but this may not be a constraint. -- Regards, Tom Ogilvy "Dave B" <Not.Available wrote in message ... Tom, I would have created the report generator in Access but was directed to NOT use Access. In fact, when I first started this project I loaded Access onto my computer and was working away when the person heading this project came by and almost threw up. They were adamant that I not introduce new software from what they are used to. So, I must do it all in Excel. Can't even use a report generator although now I've just about written one in vb to do what I've been requested to do. I must say that I've learned a lot so it was a good experience. I'm now mildly stressed-out because I can't get it any faster with the constrainsts I've been given. I guess they'll be satisfied once I get a Microsoft expert to say it's as fast as it can be given the current environment. Thanks again for your input - when I first started this job a year ago you pointed me towards the "On-Time" function in Excel (needed DDE links to update while macro was running). That made me a hero because it paved the way to convert over 100 LOTUS files to Excel. dave "Tom Ogilvy" wrote in message ... Maybe put the data into Access and use ADO. Might be faster. You can transfer an ado record set to an array in one step. I believe it will be transposed from what you would expect. Regards, Tom Ogilvy "Dave B" <Not.Available wrote in message ... Thanks Tom. I experimented with creating separate files per customer and it vastly improves the speed - 35 seconds down to 6 or less seconds. 
Since most customers have less than 2,500 rows of data, this seems like the
best solution. It may not be an option, though, since I am dependent on
other developers rewriting their code to create separate files instead of
one large file. There may also be other "reasons" to keep the "big" file -
other programs already link to it, etc. My task is to create an Excel file
that the users will use to get their customer data. If it takes too long,
the users don't like it. But for what it has to do, I think it runs pretty
well at this point. My boss wants me to call Microsoft and pay for someone
to suggest other, possibly faster, approaches, or to verify that it is as
fast as it can be given the constraints. I will look into the links you
supplied for more ideas - thanks very much for your help and advice.

dave

"Tom Ogilvy" wrote in message ...

If you are willing to break your data up into separate files, then those
wouldn't take nearly as long to open. I might suggest putting the data in
.csv files and using low-level file I/O to open them and parse out the data:

http://support.microsoft.com/support...eio/fileio.asp
File Access with Visual Basic® for Applications

Or, if you want to preprocess your data, you might want to look at this:

http://support.microsoft.com/default...42&Product=vb6
HOWTO: Write Data to a File Using WriteFile API

--
Regards,
Tom Ogilvy

"Dave B" <Not.Available wrote in message ...
#7
Hi Dave,
I don't want to confuse the issue, but have you tried Data > Get External
Data > New Database Query, using your Excel workbook as the data source?
You can sort the data as it is being pulled from the closed workbook. I
just did a quick test and Excel pulled 600 rows & 12 columns of data in
under 0.2 seconds. This uses MS Query to write a SQL query - similar to
ADO, but all Excel. Record a macro while running the query and Excel will
write the code for you, or save the query and edit it later.

--
John
johnf202 at hotmail dot com
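Recording the query as suggested above produces code along these lines. This is only a sketch of what the recorder writes - the DSN, file path, sheet name, and column names are placeholders for the real strings Excel would record:

```vb
Sub PullCustomerData()
    ' "Excel Files" is the standard ODBC DSN for Excel workbooks;
    ' path, sheet and column names below are placeholders
    With ActiveSheet.QueryTables.Add( _
            Connection:="ODBC;DSN=Excel Files;DBQ=C:\Data\BigFile.xls;", _
            Destination:=Range("A1"), _
            Sql:="SELECT * FROM [Sheet1$] WHERE CustCode = 'ABC' " & _
                 "ORDER BY CustCode, Book")
        .RefreshStyle = xlOverwriteCells
        ' Synchronous refresh so the data is in place when the sub returns
        .Refresh BackgroundQuery:=False
    End With
End Sub
```

As with the ADO approach, the filtering and sorting happen while the data is being pulled, so the big workbook never has to be opened in Excel.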
#8
Hi Dave,
Although ADO is sometimes difficult to use in certain situations (as you
have found), it may end up being much faster than what you are using now.
If the data isn't too sensitive, feel free to email the workbook to me if
you'd like, and I'll see what I can do with it using ADO. My email:
jake <at longhead [period] com

--
Regards,

Jake Marx
www.longhead.com

[please keep replies in the newsgroup - email address unmonitored]

Dave B wrote:
#9
Dave,
I have followed your thread with Tom, and IMVHO, I think processing this in
under a minute isn't bad at all. You mentioned:

> I've also tried creating links to the database file and putting the data
> into an array, but again this doesn't save time.

Have you tried reading the data into a VBA array and doing the sorts and
manipulations on the VBA array instead of on the sheet? My guess is that 15
of your seconds come from the many individual read statements.

AlexJ

"Dave B" <Not.Available wrote in message ...
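The point about the multiple read statements is the key one: every `Cells(...)` access in the posted loop is a separate round trip between VBA and Excel, and the loop makes 32 of them per row. A single `.Value` assignment pulls an entire block across in one trip, roughly as in this sketch (workbook and sheet names are placeholders, and the column list is shortened):

```vb
Sub ReadBlockInOneShot()
    Dim vAll As Variant
    Dim myGet As Variant
    Dim r As Long, j As Long

    ' Column numbers to keep, as in the posted code (shortened here)
    myGet = Array(2, 3, 21, 128, 155)

    ' One read for the whole block instead of one call per cell;
    ' vAll becomes a 1-based two-dimensional Variant array (row, column)
    vAll = Workbooks("BigFile.xls").Worksheets(1) _
               .Range("A1").Resize(42000, 208).Value

    ' Looping over the in-memory array is then essentially free
    For r = 1 To UBound(vAll, 1)
        For j = 0 To UBound(myGet)
            ' copy vAll(r, myGet(j)) into the output array here
        Next j
    Next r
End Sub
```

The one-shot read costs some memory (42,000 x 208 Variants), but it replaces over a million individual cell reads with a single one.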
#10
Alex,
I've tried so many things that I'm not sure what I've done anymore. I did
try putting the linked cells into an array and then clearing the linked
cells. It worked well but didn't cut down on time. Using arrays is the
biggest time-saver I've found, but you have to get the data into them
first. It was a neat process that I will use with other, less bulky files
in the future - Bill Manville on this newsgroup suggested it (thanks again,
Bill).

Thanks for the input.

dave

"Alex@JPCS" wrote in message ...
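Since the file is already sorted by company, one further option is to let `Range.Find` locate the first and last matching rows and then fetch the whole block with a single `.Value` read, instead of walking down cell by cell. A rough sketch, with the workbook, sheet, and company code as placeholders:

```vb
Sub GrabCompanyBlock()
    Dim ws As Worksheet
    Dim firstHit As Range, lastHit As Range
    Dim vBlock As Variant

    Set ws = Workbooks("BigFile.xls").Worksheets(1)   ' placeholder names

    ' The sheet is sorted by company, so the first and last matches
    ' bound one contiguous block of rows
    Set firstHit = ws.Columns("B").Find(What:="ABC", LookAt:=xlWhole)
    If firstHit Is Nothing Then Exit Sub                ' company not found
    Set lastHit = ws.Columns("B").Find(What:="ABC", LookAt:=xlWhole, _
                                       SearchDirection:=xlPrevious)

    ' One read fetches every row for the company, all 208 columns;
    ' pick out the 32 wanted columns from vBlock in memory afterwards
    vBlock = ws.Range(ws.Cells(firstHit.Row, 1), _
                      ws.Cells(lastHit.Row, 208)).Value
End Sub
```

This targets only the gathering step, of course - the 15 seconds spent opening the workbook would still argue for one of the ADO or query-table approaches, which avoid opening it at all.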