#1
Posted to microsoft.public.excel.programming
Speeding up the importing of data from file

I am using the following to import binary data from a file and convert it to hexadecimal. Is there a different way to do this that would be faster? I got help from the list last year to speed up my checksum routine by putting the data into a variant array, and I was able to use it for saving to a file too. I need some of the same magic for importing data from a file. The files can be over a meg in size and are taking minutes to load.
_____________________________________________________

Dim fs, f, ts, s
Const ForReading = 1            ' FSO constant (only needed if the Scripting Runtime reference is not set)
Const TristateUseDefault = -2   ' FSO constant (only needed if the Scripting Runtime reference is not set)
Set fs = CreateObject("Scripting.FileSystemObject")
Set f = fs.GetFile(FileNameWithPath)
Set ts = f.OpenAsTextStream(ForReading, TristateUseDefault)
ColumnCount = 3
RowCount = 1
Do While ts.AtEndOfStream = False    'Loop to fill column with HEX numbers
    DecimalByte = Asc(ts.Read(1))
    HexByte = Hex(DecimalByte)
    Cells(RowCount, ColumnCount) = HexByte
    RowCount = RowCount + 1
Loop
ts.Close
#2
Posted to microsoft.public.excel.programming

I was experimenting with this
------------------------------------------------------------------------
filenumber = FreeFile
Open FullFileName For Binary As filenumber
MyString = Input(FileLen(FullFileName), filenumber)
Close filenumber
------------------------------------------------------------------------
I was able to read a 4Meg file into MyString. I thought we were
limited to 64k?
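
For binary data it can be more robust to pull the file into a Byte array with Get rather than into a String with Input, since the String route converts the data through the current code page and can alter some byte values. A minimal sketch, assuming FullFileName already holds the path as in the snippet above:

Dim FileBytes() As Byte
Dim filenumber As Integer

filenumber = FreeFile
Open FullFileName For Binary Access Read As #filenumber
ReDim FileBytes(0 To LOF(filenumber) - 1)   ' size the array to the file length
Get #filenumber, 1, FileBytes               ' read the whole file in one call
Close #filenumber

' FileBytes(i) now holds byte i; for example, Hex(FileBytes(0)) gives the first byte in hex.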
#3
Posted to microsoft.public.excel.programming

I'd add this at the beginning of the code

Application.ScreenUpdating = False
Application.Calculation = xlCalculationManual

and at the end

Application.ScreenUpdating = True
Application.Calculation = xlCalculationAutomatic
--
HTH,

Barb Reinhardt



"Fan924" wrote:

I am using the following to import binary data from a file and convert
to hexadecimal. Is there a different way to do this that would be
faster? I got list help last year to speed up my checksum routine by
putting the data into a variant array. I was able to also us it for
saving to a file too. I need some of the same magic importing data
from a file. File size can be over a meg in size and is taking minutes
to load.
__________________________________________________ ___

Dim fs, f, ts, s
Set fs = CreateObject("Scripting.FileSystemObject")
Set f = fs.GetFile(FileNameWithPath)
Set ts = f.OpenAsTextStream(ForReading, TristateUseDefault)
ColumnCount = 3
RowCount = 1
Do While ts.atendofstream = False 'Loop to fill column with HEX
numbers
DecimalByte = Asc(ts.Read(1))
HexByte = Hex(DecimalByte)
Cells(RowCount, ColumnCount) = HexByte
RowCount = RowCount + 1
Loop
ts.Close
.

#4
Posted to microsoft.public.excel.programming

Already there.
#5
Posted to microsoft.public.excel.programming


I would change the way you're returning results to the spreadsheet. Try populating an array in the loop and then transferring the values all at once at the end, rather than a cell at a time - this is likely to be a lot faster (see the sketch below). Reading the file into the textstream does not take that long, even for a file of the size you're dealing with - I timed something similar in Python at around 18 seconds for 100 iterations to read a 250 KB PDF, convert it byte by byte to ANSI codes, then to hex, and return the result in an array. VBScript is perhaps a bit slower, but with Excel the transfer of data to the worksheet is the more likely place for a bottleneck to occur.
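
To make that concrete, here is a rough sketch of the array approach, combining the one-shot binary read from post #2 with a single write to the sheet at the end. The Sub name ImportHexColumn is just for illustration, and it assumes the file fits within the worksheet's row count at one byte per row:

Sub ImportHexColumn(FileNameWithPath As String)
    Dim FileBytes() As Byte
    Dim HexValues() As Variant
    Dim filenumber As Integer
    Dim i As Long

    ' Read the whole file into a byte array in one call
    filenumber = FreeFile
    Open FileNameWithPath For Binary Access Read As #filenumber
    ReDim FileBytes(0 To LOF(filenumber) - 1)
    Get #filenumber, 1, FileBytes
    Close #filenumber

    ' Convert each byte to hex in a 2-D array shaped as a single column
    ReDim HexValues(1 To UBound(FileBytes) + 1, 1 To 1)
    For i = 0 To UBound(FileBytes)
        HexValues(i + 1, 1) = Hex(FileBytes(i))
    Next i

    ' Write the whole column to the sheet in one assignment (column C, as in the original)
    ' Note: cells may need to be formatted as Text so values like "3E8" are not read as numbers
    Range(Cells(1, 3), Cells(UBound(FileBytes) + 1, 3)).Value = HexValues
End Sub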


#6
Posted to microsoft.public.excel.programming


Here is some more info on reading files.

1) You aren't really reading straight into a string. You are using an I/O function in the Windows operating system that may not read the entire file at once; that is why you are able to read more than 64K.

2) When Windows opens a file, it does different things depending on the amount of memory in your computer and the size of the file.
      a) Your computer has some very fast memory, and Windows tries to put the entire file into that memory. Moving a large file into this memory from a slower hard disk takes time, so for large files it is done in the background and your program can continue to run.

      b) Part of that fast memory is an I/O buffer that your program uses directly and that is controlled by VBA. When Windows detects that the I/O buffer has emptied, it fills the buffer with more data from the file. I'm not sure what the buffer size is today (I haven't looked in a while), but it used to be around 64K bytes.

      c) But Windows must share that memory with other processes, so depending on the size of the file, some of it may get moved to slower memory or even out to swap space on the hard drive.

3) Reading a file sequentially (one line at a time) is slow, but it requires only a small amount of memory (the I/O buffer).

4) The Scripting.FileSystemObject object has a ReadAll method that will read the entire file in one statement. This could speed the program up, or slow it down: reading the entire file could push some of the data into swap space on the hard drive. I haven't had the need in years to analyse all the combinations of events closely enough to give an accurate accounting of what will happen today on your computer, given the size of the file and the amount of memory in your machine. I just understand the theory very well!
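
For reference, a minimal sketch of the ReadAll route described above, using the same late-bound FileSystemObject as in the original post. As noted, whether this helps depends on file size and available memory, and for true binary data the text-stream conversion can alter some byte values:

Dim fs, ts, FileContents
Const ForReading = 1   ' FSO constant (only needed if the Scripting Runtime reference is not set)

Set fs = CreateObject("Scripting.FileSystemObject")
Set ts = fs.OpenTextFile(FileNameWithPath, ForReading)
FileContents = ts.ReadAll          ' pull the entire file into one string
ts.Close

' Individual bytes can then be taken from the string,
' for example Hex(Asc(Mid(FileContents, 1, 1))) for the first byte.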


--
joel
