#1
Posted to microsoft.public.excel.programming
Server process
Hi,
We have a serious performance problem with an Excel application when it is used from a satellite station (any one of the many we have), but there is no problem when the same application is run on the central server. The WAN is configured properly, but the amount of data to be read keeps growing and I am trying to find a solution as soon as possible.

The problem arises because the Excel application has to open between 1,500 and 2,000 Excel files to produce a report. This task necessarily takes a long time to execute over the 256 or 512 Kbps bandwidth available.

I thought of a possible solution but wonder whether it is feasible and whether the performance gain would be good enough. Here is my idea: I would split the application in two and place the code that opens the 2,000 Excel files in a second Excel application on the central server. Then I would run the first application from the satellite site and have it call the process (the second Excel application) that resides on the central server, so the dirty work is done directly on the server, where performance is acceptable.

With such a setup, will the process be faster? Will the server's memory be used, or will the process be executed on the satellite station using that station's memory? Does anyone know? Any ideas? Thanks.
--
Jac Tremblay
#2
Jac,
I'm no network admin, but here's my 2c.. Running from the workstation obviously means that the 2000 files have be transferred over the network prior to being opened. Assuming these files are on the server, getting the server to do the processing will avoid this transfer, but increase the load on the server. Only you know if this is acceptable. You can use CreateObject to instantiate an object on a non-local machine. The COM call across process/network will be expensive, but hardly much compared to transferring 2000 file. The final report can then be sent as a single file. However, if you have that much data to process, a DB management scheme may be more practical, where internal queries/stored procedures can be used. Alternatively, ADO and/or Pivot tables may be included. NickHK "Jac Tremblay" wrote in message ... Hi, We have a serious performance problem with an Excel application when it is used on a satellite station (any one of the many there are) but there is no problem when the same application is used on the central server. The WAN is configured properly but the data to be read is always increasing and I'm trying to find a solution to it, asap. The problem appears because the Excel application has to open between 1500 and 2000 excel files to produce a report. This task necessarily takes a long time to execute because of the 256 or 512 K band width that is used. I thought of a possible solution but wander if it possible and if it would be good enough as performance gain. Here is my idea: I would split the application in two and place the code that opens the 2000 Excel files in a second Excel application on the central server. Then, I would use the first application from the satellite site and call the process (the second Excel application) that resides on the central server to do the dirty job directly on that central server where performance is acceptable. With such a set up, will the process be faster? 
Will the server memory be used or will the process be executed on the satellite site using this station's memory? Does any one knows? Any ideas? Thank. -- Jac Tremblay |
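To make the CreateObject suggestion concrete, here is a minimal sketch of driving an Excel instance on the server via DCOM. The machine name "CENTRALSRV", the workbook path, and the BuildReport macro are placeholders, and this assumes DCOM permissions have been configured for Excel on the server. Note that Microsoft discourages unattended server-side automation of Office, so treat this as a workaround rather than a supported architecture.

```vb
' Sketch only: run the consolidation workbook on the central server,
' so the 1,500-2,000 source files never cross the WAN.
' "CENTRALSRV" and the paths below are hypothetical placeholders.
Sub RunReportOnServer()
    Dim xlServer As Object   ' late-bound Excel.Application on the server
    Dim wb As Object

    ' The optional second argument of CreateObject is the remote
    ' machine name; this requires DCOM to be set up for Excel there.
    Set xlServer = CreateObject("Excel.Application", "CENTRALSRV")
    xlServer.Visible = False

    ' Open the workbook holding the consolidation code and run it
    ' server-side; only the finished report comes back over the wire.
    Set wb = xlServer.Workbooks.Open("D:\Reports\Consolidator.xls")
    xlServer.Run "Consolidator.xls!BuildReport"

    wb.Close SaveChanges:=False
    xlServer.Quit
    Set wb = Nothing
    Set xlServer = Nothing
End Sub
```

The processing then uses the server's CPU and memory; the satellite station only pays for the cross-network COM calls and the transfer of the single result file.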
#3
Hi NickHK,
The system is already built and running. I did not participate in the design or development phases; I am just here to help find a solution. I have many ideas: some are good, others are costly, and some are both. The client has no money for this project but will be forced to do something, because the situation is getting worse every day. The files were copied a few at a time, and nobody expected there would be so many of them after such a short period of time.

I know that with ASP one can create a server-side or a client-side process for performance and other reasons. I just want to know whether that is possible with Excel (see the OP).

Thanks for your answer. It is very much appreciated.
--
Jac Tremblay

"NickHK" wrote: <snip>
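On NickHK's ADO suggestion: even without restructuring the system, ADO can pull data from a closed workbook without ever opening it in Excel, which is much cheaper than Workbooks.Open for each of the source files. A minimal sketch, assuming Excel 97-2003 (.xls) files readable by the Jet provider; the UNC path, sheet name, and range are placeholders.

```vb
' Sketch only: read a range from a closed workbook via ADO/Jet
' instead of opening it in Excel. Path, sheet and range names
' below are hypothetical placeholders.
Sub ReadClosedWorkbook()
    Dim cn As Object, rs As Object

    Set cn = CreateObject("ADODB.Connection")
    ' Jet reads .xls files directly; only the queried rows travel
    ' over the wire, not the whole workbook.
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=\\CentralSrv\Data\Sales001.xls;" & _
            "Extended Properties=""Excel 8.0;HDR=Yes"""

    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM [Sheet1$A1:D100]", cn

    ' Dump the result set onto the active sheet in one call.
    ActiveSheet.Range("A2").CopyFromRecordset rs

    rs.Close
    cn.Close
End Sub
```

Looping this over the source files still touches each of them, but it avoids the overhead of launching a workbook load for every file and transfers only the queried ranges.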