ExcelBanter

ExcelBanter (https://www.excelbanter.com/)
-   Excel Programming (https://www.excelbanter.com/excel-programming/)
-   -   Slow Calculating and optimizing (https://www.excelbanter.com/excel-programming/398155-slow-calculating-optimizing.html)

kurt

Slow Calculating and optimizing
 
Hi

I have been using the expression
=SUMPRODUCT(--('file'!$P$1:$P$65500="txt");'file'!$H$1:$H$65500)
to sum all cells in column H where column P = "txt".

I have 1200 of these expressions, which makes opening and recalculating
the workbook very slow (3-4 minutes).

The 65500 was chosen to be sure that I would never run out of data; it
was not based on any analysis.

Cutting the 65500 down to a relevant size (e.g. 10000) reduces the
calculation time dramatically.

regards

Kurt


Roger Govier[_3_]

Slow Calculating and optimizing
 
Hi Kurt

Try creating dynamic ranges for your data
For example
Insert > Name > Define, enter the Name colP, Refers to: =OFFSET($P$1,0,0,COUNTA($P:$P))

Repeat for colH

Then use
=SUMPRODUCT(--(colP="txt");--(colH))

To ensure that all of your ranges have the same length, use COUNTA($P:$P) in
all cases (or whichever column will never have any blank cells within it).

I usually prefer to have one defined name, lrow, set as
=COUNTA($P:$P)
Then all the other named ranges would be, for example,
=OFFSET($P$1,0,0,lrow)

--
Regards
Roger Govier




Tom Ogilvy

Slow Calculating and optimizing
 
While Roger gives good advice for dynamic ranges, in this case you don't
need to use SUMPRODUCT at all. Since you only have one condition, the much,
much faster

=SUMIF('file'!P:P,"txt",'file'!H:H)

will work and doesn't restrict your range.

If 'file' will be closed, however, SUMIF does not work against a closed
workbook, and you would have to stay with SUMPRODUCT and perhaps use
Roger's suggestion.

--
Regards,
Tom Ogilvy




Jim Thomlinson

Slow Calculating and optimizing
 
There is a trade-off in using the dynamic named ranges: OFFSET is a
volatile function. That means all 1,200 formulas become volatile, and
while they will calculate faster, they will also calculate more often.
This strategy could work for you or against you depending on the time
savings. If it does not work out, you might want to consider using some
code to redefine the range names whenever the overall number of rows of
source data increases or decreases.
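
That approach might be sketched as a small macro like the one below. This
is only a sketch: the sheet name "file" and the names colP/colH are
assumptions taken from this thread, not the OP's actual workbook.

```vba
Sub RedefineRangeNames()
    ' Resize the named ranges in code so that the formulas can refer to
    ' fixed, non-volatile ranges instead of OFFSET-based ones.
    ' Assumptions: data sheet is named "file", names are colP and colH.
    Dim ws As Worksheet
    Dim lastRow As Long
    Set ws = ThisWorkbook.Worksheets("file")

    ' Current row count of the data, assuming column P has no blanks
    ' within the data (same assumption as the COUNTA-based names)
    lastRow = Application.WorksheetFunction.CountA(ws.Range("P:P"))

    ' Names.Add replaces an existing defined name of the same name
    ThisWorkbook.Names.Add Name:="colP", _
        RefersTo:="='file'!$P$1:$P$" & lastRow
    ThisWorkbook.Names.Add Name:="colH", _
        RefersTo:="='file'!$H$1:$H$" & lastRow
End Sub
```

Running this from Workbook_Open, or after each data import, keeps the
names sized to the data without making the 1,200 formulas volatile.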
--
HTH...

Jim Thomlinson




Roger Govier[_3_]

Slow Calculating and optimizing
 
Hi

Whilst I agree entirely with Tom's advice to the OP about SUMIF, which
would speed up the calculations far more, and with your point about the
volatility of OFFSET, I thought after posting that I should have offered
the OP INDEX instead, as a non-volatile solution. I went out immediately
after posting and have only just returned.

Again, define lrow as per my previous posting: =COUNTA($P:$P)
Then use the following for the name colP:
=$P$1:INDEX($P:$P,lrow)

--
Regards
Roger Govier







