Tom,
I would have created the report generator in Access, but was directed NOT to
use Access. In fact, when I first started this project I loaded Access onto
my computer and was working away when the person heading this project came
by and almost threw up. They were adamant that I not introduce software
different from what they are used to. So I must do it all in Excel. I can't
even use a report generator, although by now I've just about written one in
VBA to do what I've been asked to do. I must say that I've learned a lot, so
it was a good experience. I'm now mildly stressed out because I can't make it
any faster within the constraints I've been given. I guess they'll be
satisfied once I get a Microsoft expert to say it's as fast as it can be
given the current environment.
Thanks again for your input - when I first started this job a year ago you
pointed me towards the OnTime method in Excel (I needed DDE links to
update while a macro was running). That made me a hero because it paved the
way to convert over 100 Lotus files to Excel.
dave
"Tom Ogilvy" wrote in message
...
Maybe put the data into Access and use ADO. It might be faster. You can
transfer an ADO recordset to an array in one step. I believe it will be
transposed from what you would expect.
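[A rough sketch of the recordset-to-array step described above, assuming an
Access file at C:\Data\mydata.mdb with a table named Customers - both names
are placeholders, and late binding is used so no reference is required:]

```vb
Sub GetDataViaADO()
    Dim cn As Object, rs As Object, data As Variant
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\Data\mydata.mdb"    ' placeholder path
    Set rs = cn.Execute("SELECT * FROM Customers WHERE CompanyCode = 'AVM'")
    ' GetRows pulls the entire recordset into a Variant array in one call.
    ' Note: the result is transposed -- data(field, row), not data(row, field).
    data = rs.GetRows
    rs.Close
    cn.Close
End Sub
```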
Regards,
Tom Ogilvy
"Dave B" <Not.Available wrote in message
...
Thanks Tom.
I experimented with creating separate files per customer and it vastly
improves the speed - from 35 seconds down to 6 seconds or less. Since most
customers have fewer than 2,500 rows of data, this seems like the best
solution. It may not be an option, though, since I am dependent on other
developers to rewrite their code to create separate files instead of one
large file. There may also be other "reasons" why we need to keep the "big"
file, like other programs that already link to it, etc.
My task is to create an Excel file that the users will use to get their
customer data. If it takes too long, the users don't like it. But for
what it has to do, I think it runs pretty well at this point. My boss wants
me to call Microsoft and pay for someone to suggest other ways to do this
that may be faster, or to verify that it is as fast as can be given the
constraints.
I will look into the links you supplied for more ideas - thanks very much
for your help and advice.
dave
dave
"Tom Ogilvy" wrote in message
...
If you are willing to break your data up into separate files, then those
wouldn't take nearly as long to open. I might suggest putting the data in
.csv files and using low-level file IO to open them and parse out the data.
http://support.microsoft.com/support...eio/fileio.asp
File Access with Visual Basic® for Applications
Or if you want to preprocess your data, you might want to look at this:
http://support.microsoft.com/default...42&Product=vb6
HOWTO: Write Data to a File Using WriteFile API
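[A minimal sketch of the low-level file IO approach suggested above, assuming
one comma-delimited file per customer; the path, row count, and column count
are placeholders:]

```vb
Sub ReadCsv()
    Dim f As Integer, lineText As String, parts As Variant
    Dim rows() As Variant, i As Long, j As Long
    ReDim rows(1 To 20000, 1 To 32)             ' placeholder dimensions
    f = FreeFile
    Open "C:\Data\AVM.csv" For Input As #f      ' placeholder path
    Do While Not EOF(f)
        Line Input #f, lineText                 ' read one line at a time
        parts = Split(lineText, ",")            ' parse fields (0-based array)
        i = i + 1
        For j = 1 To 32
            rows(i, j) = parts(j - 1)
        Next j
    Loop
    Close #f
End Sub
```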
--
Regards,
Tom Ogilvy
"Dave B" <Not.Available wrote in message
...
Hello,
Excel 2000
I have created a user report generator that gets data from a large (42,000
rows by 208 columns) closed Excel file.
I've optimized my code, and creating the report takes 3 seconds AFTER I get
the data. The problem is that getting the data takes 30 seconds. I want to
get the specific data and put it into an array in my code; then it is very
fast to manipulate.
My code opens the file, sorts it (primarily by Customer Code), then searches
through a column that contains Customer Codes. Once it finds the first
occurrence of the customer code, it grabs data from 32 columns (not all
together, but scattered) and puts it into an array, then loops until the last
occurrence of the customer code is found.
Opening the file takes 15 seconds and gathering the data takes 15 seconds,
resulting in 30 seconds.
If it will help, we can create a file that captures the rows that each
company's data is in, so we can get the data more efficiently. I've tried
using ADO with limited success - it doesn't save time, and it doesn't collect
all the data in columns that have varying data types. I've never used ADO,
and there might be better ways to do it with ADO.
I've also tried creating links to the database file and putting the data
into an array, but again this doesn't save time.
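[If an index file did record the first and last row for each company, the
search loop could be skipped and the whole block read at once. A minimal
sketch, assuming firstRow and lastRow come from that index and the database
sheet is active; the 208-column width matches the file described above:]

```vb
Sub GetBlock(ByVal firstRow As Long, ByVal lastRow As Long)
    Dim block As Variant
    ' One Range-to-Variant assignment reads the entire rectangle in a
    ' single step, which is far faster than looping cell by cell.
    block = Range(Cells(firstRow, 1), Cells(lastRow, 208)).Value
    ' block is now a 1-based 2-D array addressed as block(r, c)
End Sub
```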
Any suggestions will be appreciated. If you are really into it, I pasted my
code below for your perusal and critical comments.
Thanx !!
dave
Below is the code now in use:

Option Base 1

Sub Getdata()
    Application.DisplayAlerts = False
    Application.ScreenUpdating = False
    On Error GoTo ErrorHandler
    Dim i As Long
    Dim starttime
    Dim endtime
    Dim countloop
    ThisWorkbook.Activate
    'Define which columns (in required order) to pull
    '**If adding/deleting columns MUST change NumberOfItems
    'BEST TO ADD AT END - that way it won't affect other subs that look
    'for a specific location in MyArray
    MyGet = Array(2, 3, 21, 128, 155, 149, 150, 151, 152, 64, 15, 11, 12, 14, _
        148, 13, 156, 1, 5, 6, 7, 8, 16, 19, 142, 166, 170, 169, 171, 172, _
        173, 174)
    'IMPORTANT: Set number of items that the above array contains - used
    'throughout this macro
    NumberOfItems = 32
    ReDim myArray(20000, NumberOfItems) As Variant
    'Set i to 1 to begin copy at first row, which is the column headings
    i = 1
    'Open database file MyfileName as selected in opening macro
    Workbooks.Open Filename:=MyfileName
    Capturefilename = ActiveWorkbook.Name
    '++++++++++++++++++++++++++++++++++++++++++++++++++++++
    'VERY IMPORTANT: SORTS MUST BE IN CORRECT ORDER
    'Logic is tied to company & book changes as well as others
    '++++++++++++++++++++++++++++++++++++++++++++++++++++++
    'First Sort: FirmAcct, Order1, Tenor
    Range("A2").Select
    Selection.Sort Key1:=Range("S2"), Order1:=xlAscending, _
        Key2:=Range("EL2"), Order2:=xlAscending, _
        Key3:=Range("FM2"), Order3:=xlAscending, Header:=xlGuess, _
        OrderCustom:=1, MatchCase:=False, Orientation:=xlTopToBottom
    'Second Sort: Company, Book
    '(Sort Company in descending order to get "AVM" (which is by far the
    'largest) to the bottom - makes looping more efficient)
    Range("A2").Select
    Selection.Sort Key1:=Range("B2"), Order1:=xlDescending, _
        Key2:=Range("C2"), Order2:=xlAscending, Header:=xlGuess, _
        OrderCustom:=1, MatchCase:=False, Orientation:=xlTopToBottom
    'Put titles in first row of array
    For a = 1 To NumberOfItems
        myArray(1, a) = Cells(1, MyGet(a))
    Next a
    'Start at location "B2"
    Cells(2, 2).Select
    Set curcell = ActiveCell
    'Cycle through all cells in column "B" until a blank cell is encountered
    'First cell with "" (no data - absolutely empty) will cause the loop to stop
    Do While curcell.Value <> ""
        If curcell.Value = MyCompany Then
            Do While curcell.Value = MyCompany
                i = i + 1
                'Put selected columns in the current row into array
                For j = 1 To NumberOfItems
                    myArray(i, j) = Cells(curcell.Row, MyGet(j))
                Next j
                'Increment active cell to next cell below
                Set curcell = curcell.Offset(rowOffset:=1, columnOffset:=0)
            Loop
            'Once company changes no need to continue - data is sorted by company
            Exit Do
        Else
            GoTo NextItem
        End If
NextItem:
        'Increment active cell to next cell below
        Set curcell = curcell.Offset(rowOffset:=1, columnOffset:=0)
        curcell.Select
    Loop
    'Counts how many items were put in the array, less the first row (titles)
    TotalHits = i - 1
    'Close database file
    Windows(Capturefilename).Close
    'Activate this workbook
    ThisWorkbook.Activate
    GoTo Skip
'Gives error message instead of a VB error if the file doesn't exist,
'and stops macro execution
ErrorHandler:
    MsgBox ("Error occurred while trying to get database file" & Chr(13) & _
        "Ensure file " & MyfileName & " exists")
    End
Skip:
    If TotalHits < 1 Then
        Range("E2").Select
        MsgBox ("Did not find any data for selected company!" & Chr(13) & _
            " - Please ensure company code is entered correctly -")
        End
    End If
End Sub