Opening/Closing Large Files

#1
Posted to microsoft.public.excel.programming

I have a VBA process that reads through a single folder (on a corporate
network - high performance) of Excel workbooks: it sequentially opens each
one, extracts a small amount of data, then closes it without saving. Only
one workbook is open at any one time. There are 1350 workbooks, each about
7MB in size.

When the process first starts, files are opened and closed reasonably
quickly, at about 12 per minute; as the process progresses it gets slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed and never released. Is this possible, or is something else
causing the problem? My code is as follows; the names of the files to
process are already stored in column A of the sheet (shControl).

Sub ProcessFiles()

    Dim dataWb As Workbook
    Dim xr As Long, nextData As Long

    Application.EnableEvents = False
    Application.ScreenUpdating = False

    ' start row for data
    nextData = 1

    ' process each workbook
    For xr = 1 To 1350

        ' open data workbook
        Set dataWb = Workbooks.Open("H:\myFiles\" & shControl.Cells(xr, 1))

        ' get data
        With dataWb.Sheets("Appraisal")
            shControl.Cells(nextData, 4) = .Cells(10, "I").Value
            shControl.Cells(nextData, 5) = .Cells(11, "I").Value
            shControl.Cells(nextData, 6) = .Cells(12, "I").Value
        End With

        ' advance o/p row
        nextData = nextData + 1

        ' close workbook
        dataWb.Close False

    Next xr

    Application.EnableEvents = True
    Application.ScreenUpdating = True

End Sub

--

Regards,
Nigel




#2
Posted to microsoft.public.excel.programming

On Jan 24, 7:49 am, "Nigel" wrote:
I have a VBA process that reads through a single folder (on a corporate
network - high performance) of Excel workbooks: it sequentially opens each
one, extracts a small amount of data, then closes it without saving. Only
one workbook is open at any one time. There are 1350 workbooks, each about
7MB in size.
-snip-

Look at John Walkenbach's site for an article on reading from closed
workbooks:
http://j-walk.com/ss/excel/tips/tip82.htm
Also: shControl.Cells(nextData, 4) might be shControl.Cells(nextData, 4).Value.
Maybe you should Set dataWb = Nothing on every loop.
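Something along these lines (an untested sketch of the technique on that
page; the function name and the example file are placeholders, not Nigel's
actual names):

Function GetValueFromClosed(path As String, file As String, _
                            sheet As String, ref As String) As Variant
    ' Build an external reference like 'H:\myFiles\[Book1.xls]Appraisal'!R10C9
    ' and evaluate it without opening the workbook.
    Dim arg As String
    If Right$(path, 1) <> "\" Then path = path & "\"
    arg = "'" & path & "[" & file & "]" & sheet & "'!" & _
          Range(ref).Range("A1").Address(, , xlR1C1)
    GetValueFromClosed = ExecuteExcel4Macro(arg)
End Function

' Example call (hypothetical file name):
' v = GetValueFromClosed("H:\myFiles", "Book1.xls", "Appraisal", "I10")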
#3
Posted to microsoft.public.excel.programming

Thanks for the tip; getting data from a closed file using John Walkenbach's
code looks interesting. I will try that out.

--

Regards,
Nigel




"dbKemp" wrote in message
...
On Jan 24, 7:49 am, "Nigel" wrote:
I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each files
to
process are already stored on sheet (shcontrol) column A

Sub ProcessFiles()

Dim dataWb As Workbook
Dim xr As Long, nextData As Long

Application.EnableEvents = False
Application.ScreenUpdating = False

' start row for data
nextData = 1
' process each workbook
For xr = 1 To 1350

' open data workbook
Set dataWb = Workbooks.Open("H:\myFiles\" & shControl.Cells(xr, 1))

' get data
With dataWb.Sheets("Appraisal")
shControl.Cells(nextData, 4) = .Cells(10, "I").Value
shControl.Cells(nextData, 5) = .Cells(11, "I").Value
shControl.Cells(nextData, 6) = .Cells(12, "I").Value
End With

' advance o/p row
nextData = nextData + 1

' close workbook
dataWb.Close False

Next xr

Application.EnableEvents = True
Application.ScreenUpdating = True

End Sub

--

Regards,
Nigel


Look at John Walkenbach's site for article on reading from closed
workbooks:
http://j-walk.com/ss/excel/tips/tip82.htm
Also:
shControl.Cells(nextData, 4) might be shControl.Cells(nextData,
4).value
Maybe you should set dataWb=nothing on every loop.


#4
Posted to microsoft.public.excel.programming

I don't have a good suggestion.

Maybe a bad one: keep track of which file you're processing, and after a few
(before it slows down to a crawl), save and close your file, close Excel,
then reopen Excel and your file and do the next batch.

Maybe closing Excel wouldn't be necessary???

======
And I've never had to do this with 1350 workbooks, but since you're just
retrieving 3 values, maybe it would be quicker to build formulas and plop
them into the cells directly.

If it slows down after a bunch, then do it in groups and convert the
formulas to values for each bunch.

Just guesses, though. I'd test on a much smaller number of files, too.
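Maybe something like this (just a sketch, reusing the path, sheet name, and
cells from your post, so adjust to suit):

Sub BuildLinkFormulas()
    ' Write external-reference formulas instead of opening each workbook;
    ' Excel resolves links to closed files when the formulas are entered.
    Dim xr As Long
    For xr = 1 To 1350
        With shControl
            .Cells(xr, 4).Formula = _
                "='H:\myFiles\[" & .Cells(xr, 1).Value & "]Appraisal'!$I$10"
            .Cells(xr, 5).Formula = _
                "='H:\myFiles\[" & .Cells(xr, 1).Value & "]Appraisal'!$I$11"
            .Cells(xr, 6).Formula = _
                "='H:\myFiles\[" & .Cells(xr, 1).Value & "]Appraisal'!$I$12"
        End With
    Next xr
    ' Once the links have resolved, freeze the results as plain values
    With shControl.Range("D1:F1350")
        .Value = .Value
    End With
End Sub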

Nigel wrote:

I have a VBA process that reads through a single folder (on a corporate
network - high performance) of Excel workbooks: it sequentially opens each
one, extracts a small amount of data, then closes it without saving. Only
one workbook is open at any one time. There are 1350 workbooks, each about
7MB in size.
-snip-

--

Dave Peterson
#5
Posted to microsoft.public.excel.programming

Hi Dave,
Well, it might come to that. The files are tracked, and I am considering
creating 10 subfolders with fewer than 135 files each, as that number
appears to work OK. So to get it done I will probably do that.

I do want to discover what is causing this breakdown, though, so I shall
run some more tests, as I am not sure why the system slows, or indeed
whether it is Excel or the OS.

Cheers

--

Regards,
Nigel




"Dave Peterson" wrote in message
...
I don't have a good suggestion.

Maybe a bad one. Keep track of what file you're processing and after a
few
(before it slows down to a crawl), close (and save the file), close excel,
reopen excel and your file and do the next batch.

Maybe closing excel woudn't be necessary???

======
And I've never had to do this with 1350 workbooks, but since you're just
retrieving 3 values, maybe it would be quicker to build formulas and plop
them
into the cells directly.

If it slows down after a bunch, then do it in groups and convert the
formulas to
values for each bunch.

Just guesses, though. I'd test on a much smaller number of files, too.

Nigel wrote:

I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each files
to
process are already stored on sheet (shcontrol) column A

Sub ProcessFiles()

Dim dataWb As Workbook
Dim xr As Long, nextData As Long

Application.EnableEvents = False
Application.ScreenUpdating = False

' start row for data
nextData = 1
' process each workbook
For xr = 1 To 1350

' open data workbook
Set dataWb = Workbooks.Open("H:\myFiles\" & shControl.Cells(xr, 1))

' get data
With dataWb.Sheets("Appraisal")
shControl.Cells(nextData, 4) = .Cells(10, "I").Value
shControl.Cells(nextData, 5) = .Cells(11, "I").Value
shControl.Cells(nextData, 6) = .Cells(12, "I").Value
End With

' advance o/p row
nextData = nextData + 1

' close workbook
dataWb.Close False

Next xr

Application.EnableEvents = True
Application.ScreenUpdating = True

End Sub

--

Regards,
Nigel


--

Dave Peterson




#6
Posted to microsoft.public.excel.programming

I'd really consider building the formulas--I'm not sure if you read that
portion.

Nigel wrote:

Hi Dave,
Well, it might come to that. The files are tracked, and I am considering
creating 10 subfolders with fewer than 135 files each, as that number
appears to work OK. So to get it done I will probably do that.
-snip-

--

Dave Peterson
#7
Posted to microsoft.public.excel.programming

Yes, I did read it, and I am going to give it a try, together with the
read-from-closed-file approach from John W's site.

Thanks

--

Regards,
Nigel




"Dave Peterson" wrote in message
...
I'd really consider building the formulas--I'm not sure if you read that
portion.

Nigel wrote:

Hi Dave,
Well it might come to that. . The files are tracked and I am
considering
creating 10 sub folders with less then 135 files each as this number
appears
to work OK. So to get it done I will probably do that.

I do want to discover what is causing this breakdown though so I shall
run
some more tests as I am not sure why the system slows or indeed if it is
Excel or the OpSys.

Cheers

--

Regards,
Nigel


"Dave Peterson" wrote in message
...
I don't have a good suggestion.

Maybe a bad one. Keep track of what file you're processing and after a
few
(before it slows down to a crawl), close (and save the file), close
excel,
reopen excel and your file and do the next batch.

Maybe closing excel woudn't be necessary???

======
And I've never had to do this with 1350 workbooks, but since you're
just
retrieving 3 values, maybe it would be quicker to build formulas and
plop
them
into the cells directly.

If it slows down after a bunch, then do it in groups and convert the
formulas to
values for each bunch.

Just guesses, though. I'd test on a much smaller number of files, too.

Nigel wrote:

I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets
slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something
else
causing the problem? My code is as follows.... the names of each
files
to
process are already stored on sheet (shcontrol) column A

Sub ProcessFiles()

Dim dataWb As Workbook
Dim xr As Long, nextData As Long

Application.EnableEvents = False
Application.ScreenUpdating = False

' start row for data
nextData = 1
' process each workbook
For xr = 1 To 1350

' open data workbook
Set dataWb = Workbooks.Open("H:\myFiles\" & shControl.Cells(xr, 1))

' get data
With dataWb.Sheets("Appraisal")
shControl.Cells(nextData, 4) = .Cells(10, "I").Value
shControl.Cells(nextData, 5) = .Cells(11, "I").Value
shControl.Cells(nextData, 6) = .Cells(12, "I").Value
End With

' advance o/p row
nextData = nextData + 1

' close workbook
dataWb.Close False

Next xr

Application.EnableEvents = True
Application.ScreenUpdating = True

End Sub

--

Regards,
Nigel


--

Dave Peterson


--

Dave Peterson


#8
Posted to microsoft.public.excel.programming


Nigel,
Since I am not on a network, I hesitate to reply.
Your other responses cover most of the ideas I considered;
however, you might also try using an object reference instead of
the sheet code name "shControl" (see the sketch below).
Also, is there any kind of security protection on the network that
might shut down access after repeated access requests?
That might be something your IT people could have set up.
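Something like this (the tab name "Control" is just a guess; use your
actual sheet name):

Dim ws As Worksheet
Set ws = ThisWorkbook.Worksheets("Control")   ' object reference, not code name
' ...then use ws.Cells(nextData, 4) etc. wherever the loop currently
' uses shControl.Cells(...)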
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel"
wrote in message
I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens, extracts
a small number of data, then close without save. Only one workbook is open
at anyone time. There are 1350 workbooks which are all about ~7MB in size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each files to
process are already stored on sheet (shcontrol) column A

-snip-
Regards,
Nigel




#9
Posted to microsoft.public.excel.programming

Hi Jim

The object reference I will try.

I have just checked the PC resources and they do not appear to be
accumulating: as you might expect, resources go down as a file is opened
and return to similar levels after it is closed. I cannot see any leakage,
and I am stumped.

Good point on the network front; I need to check with the IT guys, but this
slowdown is progressive: the more files get processed, the slower it gets,
from 5 secs per file at the start to 15 secs per file by 100 files, 30 secs
per file by 200, etc. I am not sure the network would slow like this unless
there is some form of bandwidth throttling.



--

Regards,
Nigel




"Jim Cone" wrote in message
...

Nigel,
Since I am not on a network, I hesitate to reply.
Your other responses cover most of the ideas I considered,
however, you also might try using an object reference instead of
the sheet code name - "shControl".
Also, is there any kind of security protection on the network that
might shutdown access after repeated access requests?
That might be something your IT people could have set up.
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel"
wrote in message
I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each files
to
process are already stored on sheet (shcontrol) column A

-snip-
Regards,
Nigel





#10
Posted to microsoft.public.excel.programming

I suspect it's the string pool...
AFAIK, Excel stores every unique string it finds in the string pool, and
then every time it encounters a string it looks it up in the pool to see
if it already exists.
I think the string pool only ever gets larger and slower within an Excel
session.

If it is the root cause of the problem, then the only solution is to close
Excel partway through and start from where you left off.
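A bare-bones sketch of such a restartable batch (the progress cell Z1 and
the batch size are arbitrary choices, not tested against your files):

Sub ProcessNextBatch()
    ' Resume from the last row recorded in shControl.Range("Z1"), process a
    ' batch, then quit Excel; a fresh session picks up where this one stopped.
    Dim startRow As Long, xr As Long
    Const BATCH_SIZE As Long = 100   ' tune to where the slowdown begins

    startRow = shControl.Range("Z1").Value + 1
    For xr = startRow To _
             Application.WorksheetFunction.Min(startRow + BATCH_SIZE - 1, 1350)
        ' ... open, read, close as in the original ProcessFiles loop ...
        shControl.Range("Z1").Value = xr   ' record progress as we go
    Next xr
End Sub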

Charles
__________________________________________________
Outlines for my Sessions at the Australia Excel Users Group
http://www.decisionmodels.com/OZEUC.htm


"Nigel" wrote in message
...
Hi Jim

The object reference I will try.

I have just checked the PC resources and it does not appear to be
accumulating more, as you might expect the resources go down as the file
is opened and return to similar levels after the file is closed. I cannot
see any leakage and I am stumped.

Good point on the network front, I need to check with the IT guys, but
this slowdown is progressive as more files get processed it get slower.
From 5 secs per file at the start down to 15 secs per file by 100 files,
30 secs per file by 200.....etc. Not sure the network would slow like
this unless there is some form of bandwidth throttling?



--

Regards,
Nigel




"Jim Cone" wrote in message
...

Nigel,
Since I am not on a network, I hesitate to reply.
Your other responses cover most of the ideas I considered,
however, you also might try using an object reference instead of
the sheet code name - "shControl".
Also, is there any kind of security protection on the network that
might shutdown access after repeated access requests?
That might be something your IT people could have set up.
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel"
wrote in message
I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each files
to
process are already stored on sheet (shcontrol) column A

-snip-
Regards,
Nigel









#11
Posted to microsoft.public.excel.programming

Good call. I read the MS article on the string pool; one fix is to create a
large string pool up front by filling a temporary sheet with a large number
of strings. When the file is re-opened, Excel does not keep adding to the
pool, as it already exists. Sample code below, which I am about to put to
the test.

Public Sub GrowStringPool()
    ' Fill a scratch sheet with many unique strings so the workbook's
    ' string pool is already large when the file is re-opened.
    Dim r As Long
    Dim c As Long
    Dim text As String
    text = "Here is some text:"
    For c = 1 To 21
        For r = 1 To 1000
            ' Use both indices so every string is unique
            ' (r * c alone repeats values, e.g. 2*3 = 3*2)
            Cells(r, c).Value = text & Str(r) & Str(c)
        Next r
    Next c
End Sub

--

Regards,
Nigel




"Charles Williams" wrote in message
...
I suspect its the string pool ...
AFAIK the way Excel works it stores every unique string it finds in the
string pool and then every time it finds a string it looks it up in the
string pool to see if it exists.
I think the string pool only ever gets larger and slower within an excel
session.

If it is the root cause of the problem then the only solution is to close
Excel partway through and start from where you left off.

Charles
__________________________________________________
Outlines for my Sessions at the Australia Excel Users Group
http://www.decisionmodels.com/OZEUC.htm


"Nigel" wrote in message
...
Hi Jim

The object reference I will try.

I have just checked the PC resources and it does not appear to be
accumulating more, as you might expect the resources go down as the file
is opened and return to similar levels after the file is closed. I
cannot see any leakage and I am stumped.

Good point on the network front, I need to check with the IT guys, but
this slowdown is progressive as more files get processed it get slower.
From 5 secs per file at the start down to 15 secs per file by 100 files,
30 secs per file by 200.....etc. Not sure the network would slow like
this unless there is some form of bandwidth throttling?



--

Regards,
Nigel




"Jim Cone" wrote in message
...

Nigel,
Since I am not on a network, I hesitate to reply.
Your other responses cover most of the ideas I considered,
however, you also might try using an object reference instead of
the sheet code name - "shControl".
Also, is there any kind of security protection on the network that
might shutdown access after repeated access requests?
That might be something your IT people could have set up.
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel"
wrote in message
I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets
slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each files
to
process are already stored on sheet (shcontrol) column A

-snip-
Regards,
Nigel








#12
Posted to microsoft.public.excel.programming

It worked!

I added a large string pool, re-opened the file, and ran my code. The total
load time for a few files trebled. Clearing the sheet with the strings on it
and re-running the code brought the load time back down. This appears to be
evidence that the string pool is the issue.

I am now testing with a high volume of files.

Thanks for the help

--

Regards,
Nigel




"Nigel" wrote in message
...
Good call. I read the MS article on the string pool and one fix is to
create a large string pool by filling a temporary sheet with a large
number of strings. When re-opened this pool results in Excel not adding
more as it already exists. Sample code below, which I am about to put to
the test.

Public Sub GrowStringPool()
Dim row As Integer
Dim col As Integer
Dim text As String
text = "Here is some text:"
For col = 1 To 21
For row = 1 To 1000
Cells(row, col).Value = text + Str(row * col)
row = row + 1
Next
col = col + 1
Next
End Sub

--

Regards,
Nigel




"Charles Williams" wrote in message
...
I suspect its the string pool ...
AFAIK the way Excel works it stores every unique string it finds in the
string pool and then every time it finds a string it looks it up in the
string pool to see if it exists.
I think the string pool only ever gets larger and slower within an excel
session.

If it is the root cause of the problem then the only solution is to close
Excel partway through and start from where you left off.

Charles
__________________________________________________
Outlines for my Sessions at the Australia Excel Users Group
http://www.decisionmodels.com/OZEUC.htm


"Nigel" wrote in message
...
Hi Jim

The object reference I will try.

I have just checked the PC resources and it does not appear to be
accumulating more, as you might expect the resources go down as the file
is opened and return to similar levels after the file is closed. I
cannot see any leakage and I am stumped.

Good point on the network front, I need to check with the IT guys, but
this slowdown is progressive as more files get processed it get slower.
From 5 secs per file at the start down to 15 secs per file by 100 files,
30 secs per file by 200.....etc. Not sure the network would slow like
this unless there is some form of bandwidth throttling?



--

Regards,
Nigel




"Jim Cone" wrote in message
...

Nigel,
Since I am not on a network, I hesitate to reply.
Your other responses cover most of the ideas I considered,
however, you also might try using an object reference instead of
the sheet code name - "shControl".
Also, is there any kind of security protection on the network that
might shutdown access after repeated access requests?
That might be something your IT people could have set up.
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel"
wrote in message
I have a VBA process that reads thru a single folder (on a corporate
network - high performance) of Excel workbooks, sequentially opens,
extracts
a small number of data, then close without save. Only one workbook is
open
at anyone time. There are 1350 workbooks which are all about ~7MB in
size.

When the process first starts files are opened and closed reasonably
quickly, at about 12 per minute, as the process progresses it gets
slower
and slower and eventually appears to stop (no error messages).

It appears as if system resources are being consumed as each file is
processed that are not released. Is this possible or is something else
causing the problem? My code is as follows.... the names of each
files to
process are already stored on sheet (shcontrol) column A

-snip-
Regards,
Nigel









#13
Posted to microsoft.public.excel.programming


Some more thoughts, for what they are worth...
1. Those wouldn't all be xl 2003 workbooks that calculate when they open?
(See the snippet after this list.)
2. Could you move the folder to your own drive, do your work, and
move it back?
3. Dave's idea about using formulas seems more like the way to go.
You wouldn't have to open each workbook, and since the cell locations
are fixed, the formulas couldn't be very complicated.
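For point 1, the usual guard around the batch (standard Excel VBA, just a
sketch):

' Suspend recalculation while the batch runs, restore afterwards
Application.Calculation = xlCalculationManual
' ... open/read/close loop from ProcessFiles ...
Application.Calculation = xlCalculationAutomatic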
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel" wrote in message
Hi Jim
The object reference I will try.

I have just checked the PC resources and it does not appear to be
accumulating more, as you might expect the resources go down as the file is
opened and return to similar levels after the file is closed. I cannot see
any leakage and I am stumped.

Good point on the network front, I need to check with the IT guys, but this
slowdown is progressive as more files get processed it get slower. From 5
secs per file at the start down to 15 secs per file by 100 files, 30 secs
per file by 200.....etc. Not sure the network would slow like this unless
there is some form of bandwidth throttling?
--

Regards,
Nigel



#14
Posted to microsoft.public.excel.programming

Hi Jim
They are xl2003, and calculation is not switched to manual, so I will do
that; but in fact there is not much to calculate in these workbooks. They
contain a lot of forms and formatting, hence the large size.

I have not tried moving them locally, so that certainly might be an option
if the network is the limiting factor.

I am working on a formula approach and will be testing it soon.

Thanks


--

Regards,
Nigel




"Jim Cone" wrote in message
...

Some more thoughts for what they are worth...
1. Those wouldn't all be xl 2003 workbooks that calculate when they open?
2. Could you move the folder to your own drive, do your work and
move it back?
3. Dave's idea about using formulas seems more like the way to go.
You wouldn't have to open each workbook and since the cell locations
are fixed they couldn't be very complicated.
--
Jim Cone
San Francisco, USA
http://www.realezsites.com/bus/primitivesoftware
(Excel Add-ins / Excel Programming)



"Nigel" wrote in message
Hi Jim
The object reference I will try.

I have just checked the PC resources and it does not appear to be
accumulating more, as you might expect the resources go down as the file
is
opened and return to similar levels after the file is closed. I cannot
see
any leakage and I am stumped.

Good point on the network front, I need to check with the IT guys, but
this
slowdown is progressive as more files get processed it get slower. From 5
secs per file at the start down to 15 secs per file by 100 files, 30 secs
per file by 200.....etc. Not sure the network would slow like this
unless
there is some form of bandwidth throttling?
--

Regards,
Nigel




#15
Posted to microsoft.public.excel.programming

Nigel,

You may have to add a "Set dataWb = Nothing" statement inside the For/Next
loop. I believe object references may not be released when you re-Set them,
which could still cause a memory leak, even though you may not actually be
running out of memory.

Good luck,

