  #1
Posted to microsoft.public.excel.programming
Server process

Hi,
We have a serious performance problem with an Excel application when it is
used on a satellite station (any one of the many there are), but there is no
problem when the same application is used on the central server. The WAN is
configured properly, but the amount of data to be read keeps increasing and
I'm trying to find a solution as soon as possible.
The problem arises because the Excel application has to open between 1,500
and 2,000 Excel files to produce a report. This task necessarily takes a long
time to execute over the 256 or 512 Kbps of bandwidth available.
I have thought of a possible solution but wonder if it is feasible and
whether the performance gain would be good enough. Here is my idea:
I would split the application in two and place the code that opens the 2,000
Excel files in a second Excel application on the central server. The first
application, run from the satellite site, would then call the process (the
second Excel application) that resides on the central server to do the
dirty job directly on that server, where performance is acceptable.
With such a setup, will the process be faster? Will the server's memory be
used, or will the process be executed on the satellite station using that
station's memory?
Does anyone know?
Any ideas?
Thanks.
--
Jac Tremblay
  #2
Posted to microsoft.public.excel.programming

Jac,
I'm no network admin, but here's my 2c..

Running from the workstation obviously means that the 2,000 files have to be
transferred over the network before being opened. Assuming these files are
on the server, getting the server to do the processing will avoid this
transfer, but increase the load on the server. Only you know if this is
acceptable.
You can use CreateObject to instantiate an object on a non-local machine.
The COM call across the process/network boundary will be expensive, but
trivial compared to transferring 2,000 files. The final report can then be
sent back as a single file.
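
A minimal VBA sketch of that idea, assuming DCOM is enabled between the two
machines; the server name, workbook path, and macro name are hypothetical
placeholders:

```vba
Sub RunReportOnServer()
    ' Hypothetical names - adjust for your environment.
    Const SERVER_NAME As String = "CENTRALSRV"
    Const WORKBOOK_PATH As String = "D:\Reports\Consolidate.xls"

    Dim xlApp As Object
    Dim wb As Object

    ' CreateObject's optional second argument launches the object on the
    ' named remote machine via DCOM instead of on this workstation.
    Set xlApp = CreateObject("Excel.Application", SERVER_NAME)

    ' The workbook holding the consolidation code lives on the server,
    ' so the 2,000 source files are opened locally there, not over the WAN.
    Set wb = xlApp.Workbooks.Open(WORKBOOK_PATH)
    xlApp.Run "Consolidate.xls!BuildReport"

    wb.Close SaveChanges:=False
    xlApp.Quit
    Set xlApp = Nothing
End Sub
```

Bear in mind that Microsoft advises against unattended server-side
automation of Office, so treat this as a workaround rather than a supported
design.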

However, if you have that much data to process, a database management scheme
may be more practical, where queries/stored procedures can do the heavy
lifting on the server.

Alternatively, ADO and/or pivot tables may be worth considering.
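
As a sketch of the ADO route: each source workbook can be queried in place,
without launching a second Excel instance or fully opening the file. The
UNC path and sheet name below are assumptions for illustration:

```vba
Sub ReadClosedWorkbook()
    ' Late binding, so no ADO reference is required in the VBA project.
    ' Path and sheet name are hypothetical - substitute your own.
    Dim cn As Object, rs As Object

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=\\Server\Share\Data001.xls;" & _
            "Extended Properties=""Excel 8.0;HDR=Yes"""

    ' Pull only the rows/columns needed instead of opening the workbook.
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM [Sheet1$]", cn

    ThisWorkbook.Worksheets(1).Range("A2").CopyFromRecordset rs

    rs.Close
    cn.Close
End Sub
```

Looping this over the file list still moves the data across the WAN, so it
helps most when only a small slice of each workbook is actually needed.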

NickHK

  #3
Posted to microsoft.public.excel.programming

Hi NickHK,
The system is already built and running. I did not participate in the design
or development phases; I'm just here to help find a solution. I have many
ideas. Some are good, others are costly, and some are both.
The client has no money for this project but will be forced to do something,
because the situation is getting worse every day.
The files have been copied a few at a time, and nobody thought that there
would be so many of them after such a short period of time.
I know that with ASP, one can create a server-side process or a client-side
process for performance and other reasons.
I just want to know if that is possible with Excel (refer to the OP).
Thanks for your answer. It is much appreciated.

--
Jac Tremblay

