Posted to microsoft.public.excel.programming
Ronald Dodge[_2_]

Exceeding 65K of defined names within workbook causes workbook to go into repair mode when it is opened

Even though I have used range names in the past, I had never used them to this
extent until this past March, when I started using them extensively and
quickly ran into this limit.

As for the collection Count property I asked about, it made me wonder whether
the count (or the underlying index) is stored as an unsigned 16-bit integer,
which is what I appear to have run into with the Names collection object in
particular. Whether it's signed or not, 2 bytes only gives you 65,536 distinct
ID values. I haven't had anything else run into this 2-byte addressing
limitation, but it wouldn't surprise me. I have done some extensive
programming, including creating my own objects and collections of objects, but
I never really tested the maximum number of items those collections could
hold. Of course, given that a single VBProject can only use up to 64MB of RAM,
that limit may not be hit as easily.
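
If I ever get around to testing a plain VBA Collection, the test would be a
rough sketch along these lines (the 100,000 figure is just an arbitrary number
past 65,536):

Sub TestCollectionCount()
    ' Push a plain VBA Collection past 65,536 items and see whether
    ' Count keeps climbing (it shows as a Long in the Object Browser).
    Dim col As New Collection
    Dim i As Long
    For i = 1 To 100000
        col.Add i
    Next i
    Debug.Print col.Count   ' expect 100000 if there is no 16-bit cap
End Sub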

As for the recalc you mention, the workbook is set to manual calculation, and
the instance of Excel it opens in is also set to manual calculation, so the
repair isn't even being triggered by a calculation factor. For that set of
files, I mainly use VBA to control the calculations of the various workbooks
and worksheets. Controlling calculation through VBA has saved a lot of
processing time, since it lets me cut out an extraordinary amount of redundant
calculation compared with letting Excel recalculate everything on its own
under automatic calculation or a full forced recalc.
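
In broad strokes, the approach looks something like this (the sheet and range
names here are made up for the example):

Sub RecalcOnlyWhatIsNeeded()
    ' Keep Excel in manual calculation and recalculate only the pieces
    ' that actually depend on the data that changed.
    Application.Calculation = xlCalculationManual
    Application.ScreenUpdating = False

    Worksheets("Inputs").Calculate                      ' one sheet
    Worksheets("Summary").Range("A1:D50").Calculate     ' one range

    Application.ScreenUpdating = True
End Sub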

Ronald R. Dodge, Jr.
Master MOUS 2000

"Peter T" <peter_t@discussions wrote in message
...
I have no idea as to the specific problem for the apparent Names limit you've
hit, never having tested with anything like that many. However, I'm pretty
sure there's not an inherent 64k limit to collections in general. In theory
the count should extend to a Long, though you would undoubtedly hit memory
constraints before getting anywhere close to that. However, specific
collections may be limited due to other factors.

The 64k rows limit affects quite a few things in Excel; one is the array
limit, which is actually one less (I don't just mean array-entered formulas,
but certain array-type functions). Perhaps something like that is referencing
your names, which might only become apparent when a full recalc occurs on
open. I know it's not related, but there are also 64k limits with certain
things in W9x and ME with certain APIs (I got bit by one recently).

Now that I seem to have found the real limit of defined names, it's time to
start thinking in other directions to address the issues I have been facing.


Sounds well overdue!

Regards,
Peter T



"Ronald Dodge" wrote in message
...
Okay, I'm at it again, but only with further testing on this defined names
issue. I have seen various posts stating that an excessive number of range
names causes files to process significantly slower, though I have seen no
such symptom in my own files. However, I have noticed a certain pattern, and
it raises another question.

What is the maximum number of items any one collection can have without
causing issues?

Why do I ask this?

It seems to me that the collection is using a 2-byte (16-bit) address as its
index number. If that is the case, then once you exceed 65,536 items you no
longer have a unique ID number, which is what appears to be causing the issues
I have run into. The number 65,536 also instantly made me think of rows, which
makes me wonder whether this is also the reason MS made 65,536 the maximum
number of rows one can have within a single worksheet.
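
To give an idea of the kind of test I ran, a loop along these lines (the name
prefix and cell reference are arbitrary) can be used on a throwaway workbook
to push the Names collection up to and past that count:

Sub AddManyNames()
    ' Add defined names until the Names collection passes 65,536 entries.
    ' Only run this on a throwaway workbook that has a Sheet1.
    Dim i As Long
    For i = 1 To 66000
        ThisWorkbook.Names.Add Name:="Test_" & i, RefersTo:="=Sheet1!$A$1"
    Next i
    Debug.Print ThisWorkbook.Names.Count
End Sub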

What had me thinking in this direction?

Remember the early Windows 95 edition, when you could only have up to 2GB of
HD space per drive letter, so if you had a larger drive it had to be split
into multiple partitions to be able to use all of its space? That issue came
back to the fact that the original FAT filing system could only address up to
2GB of space per drive/partition.

In this case, once the Names collection has more entries than that index can
address, the workbook goes into repair mode when it is opened, which makes the
workbook almost useless because the repair strips a lot of stuff out of it.
While the application specifications say the limit is the amount of RAM on the
system, I'm nowhere near that limit. Not only that, but supposedly Excel 2002
should be able to handle up to about 160MB of RAM usage, and according to Task
Manager, when I ran into this issue Excel was only using 67MB of RAM on a
system with 512MB of RAM. Also according to Task Manager, I still had 130MB of
RAM free along with 768MB of swap file available, for a total working memory
availability of 1.28GB, of which only about 853MB had been used.

Now that I seem to have found the real limit of defined names, it's time to
start thinking in other directions to address the issues I have been facing.
I would venture to claim that the specification limit for defined names is
indeed 65,536, just like the maximum number of rows is 65,536. The "Cells"
object uses a 2-byte by 1-byte index method, similar to a 2-dimensional
array. To help me track down the issue, I used the watch window to look at the
Count value on the Names collection object, and after several tests, watching
that value along with other things is what led me to draw that conclusion.
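
For reference, the value I was watching can also be checked directly from the
Immediate window or a one-line routine like this:

Sub ShowNameCount()
    ' Print how many defined names the workbook currently holds.
    Debug.Print ThisWorkbook.Names.Count
End Sub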
--

Sincerely,

Ronald R. Dodge, Jr.
Master MOUS 2000