ExcelBanter (https://www.excelbanter.com/)
-   Excel Programming (https://www.excelbanter.com/excel-programming/)
-   -   VBA more efficient than standard functions? (https://www.excelbanter.com/excel-programming/417095-vsb-more-efficient-then-standard-functions.html)

Jan

VBA more efficient than standard functions?
 
hi,

i have linked 1 worksheet to 2 other worksheets (all 3 in the same file)
using VLOOKUP.

in total this pulls 3 different cells into 3 different columns for all
65536 rows. an update takes 30 minutes. i assume this could be more
efficient, but i don't know how to do that. any suggestions?

thanks.

Gary''s Student

VBA more efficient than standard functions?
 
65536 rows seems a little excessive. Can you limit the tables to fewer rows
than that?
--
Gary''s Student - gsnu200805



Jim Thomlinson

VBA more efficient than standard functions?
 
Relating that many cells is a problem for Excel. One problem is that you
blow through the 65,536 dependency-tree limit, making all formulas act
volatile. The other is that the sheer number of calculations is overwhelming.
VBA or VBS is not going to help you in this case, as they will operate slower
than VLOOKUP. Your best option would probably be a relational database,
which is designed for exactly this purpose.

Here is a link covering calculation speed and general memory performance...
http://www.decisionmodels.com/index.htm
--
HTH...

Jim Thomlinson
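The relational route suggested above can be sketched with SQLite as a stand-in (table and column names below are invented for illustration): one join over an indexed key replaces thousands of per-row VLOOKUPs.

```python
import sqlite3

# Sketch of the relational-database approach; all names and data here
# are made up for illustration only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE entries (person TEXT, hours REAL)")
con.execute("CREATE TABLE people (person TEXT, dept TEXT)")
con.executemany("INSERT INTO entries VALUES (?, ?)",
                [("ann", 7.5), ("bob", 8.0)])
con.executemany("INSERT INTO people VALUES (?, ?)",
                [("ann", "sales"), ("bob", "ops")])

# One join does the work of a VLOOKUP per data row.
rows = con.execute(
    "SELECT e.person, e.hours, p.dept "
    "FROM entries e JOIN people p ON p.person = e.person "
    "ORDER BY e.person").fetchall()
print(rows)  # [('ann', 7.5, 'sales'), ('bob', 8.0, 'ops')]
```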




Jan

VBA more efficient than standard functions?
 
i might, but i would still need a large number of rows. i use this to track
and report time entries from a large group of people over time.






Jan

VBA more efficient than standard functions?
 
Jim,

is there a way to identify the last record in a worksheet? in that case i
see possibilities to limit the formulas to those rows.



Jim Thomlinson

VBA more efficient than standard functions?
 
You can use a dynamic named range. Keep in mind that it uses the OFFSET
function, which is volatile...

http://www.cpearson.com/excel/excelF.htm#DynamicRanges
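For reference, the pattern described on that page looks like this (the sheet and column names are placeholders; COUNTA sizes the range to the non-blank cells in column A, so the range grows and shrinks with the data):

```
=OFFSET(Sheet1!$A$1,0,0,COUNTA(Sheet1!$A:$A),1)
```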
--
HTH...

Jim Thomlinson




Bob Phillips[_3_]

VBA more efficient than standard functions?
 
Rather than use 65536, use a method to determine the last used row, such as

LastRow = Cells(Rows.Count, "A").End(xlUp).Row ' last non-empty cell in column A

and use that to determine the size of the data to be processed.
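To illustrate why limiting the work to the rows actually in use pays off, here is a rough Python sketch of the same idea (all names and data invented): the lookup table is indexed once, and each data row up to the last used row is then resolved in constant time instead of with a fresh scan.

```python
# Sketch only: simulate replacing per-row VLOOKUPs with one indexed pass.
lookup_rows = [("ann", "sales"), ("bob", "ops")]        # the lookup table
data_rows = [("bob", 8.0), ("ann", 7.5), ("ann", 1.0)]  # rows up to LastRow

dept = dict(lookup_rows)  # build the index once: O(n)
result = [(name, hours, dept.get(name, "#N/A"))  # each lookup is O(1)
          for name, hours in data_rows]
print(result)
# [('bob', 8.0, 'ops'), ('ann', 7.5, 'sales'), ('ann', 1.0, 'sales')]
```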

--
__________________________________
HTH

Bob





