#1
Posted to microsoft.public.excel.programming
Memory Problem
I keep getting an out of memory error (error 7) when I run this:

Dim DataArray As Variant
Dim AMax As Long
Dim BMax As Long
Dim CMax As Long

AMax = 65000
BMax = 200
CMax = 1
ReDim DataArray(1 To AMax, 1 To BMax, 1 To CMax)

Am I simply running too many loops for Excel to handle?

---
Message posted from http://www.ExcelForum.com/
#2
Posted to microsoft.public.excel.programming
Memory Problem
I don't think Excel is the problem. 65,000 is a lot of elements, and on
top of that you're doing it 200 times (13,000,000 total elements in your array!). Take a look at how much memory you have in the first place and how many applications you have running. Try cutting that number in half and see how it works, and then try to determine exactly what the magic number is that makes this happen.

Also, because CMax = 1, the array ReDim DataArray(1 To AMax, 1 To BMax) is a simpler way to do the exact same thing.

- Piku
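A minimal sketch of that suggestion, using the variable names from the original post: dropping the size-1 third dimension doesn't change the element count, but it removes a redundant subscript from every access.

```vba
Sub TwoDVersion()
    ' Same 13,000,000 elements as the 3-D version, one fewer index everywhere.
    Dim DataArray As Variant
    Dim AMax As Long, BMax As Long

    AMax = 65000
    BMax = 200
    ReDim DataArray(1 To AMax, 1 To BMax)

    ' Every access now uses two subscripts instead of three:
    DataArray(1, 1) = 42#    ' instead of DataArray(1, 1, 1)
End Sub
```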
#3
Posted to microsoft.public.excel.programming
Memory Problem
What version of Excel are you using? You're probably running up
against Excel's own memory limitations. See http://www.decisionmodels.com/memlimitsc.htm for more detailed information.

You're allocating 13,000,000 elements in the array, each a Variant. A Variant requires 22 bytes of memory plus memory to hold the contents of the variant, so right out of the box you're allocating about 286MB. Once you start putting values in the array, you'll go even higher.

--
Cordially,
Chip Pearson
Microsoft MVP - Excel
Pearson Software Consulting, LLC
www.cpearson.com

"ExcelMonkey" wrote in message ... [original question quoted]
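A sketch of the byte arithmetic above, plus one mitigation not mentioned in the thread: if the values are purely numeric, a typed Double array stores 8 bytes per element with no per-element Variant overhead.

```vba
Sub MemoryEstimate()
    Dim AMax As Long, BMax As Long
    AMax = 65000
    BMax = 200

    ' Variant array: 13,000,000 elements * 22 bytes/Variant = 286,000,000
    ' bytes (~286 MB) before any contents are stored.
    Debug.Print "Variant bytes:"; CDbl(AMax) * BMax * 22

    ' Typed alternative: a Double array holds 8 bytes per element,
    ' data included, so 13,000,000 * 8 = 104,000,000 bytes (~104 MB).
    Dim DataArray() As Double
    ReDim DataArray(1 To AMax, 1 To BMax)
    Debug.Print "Double bytes:"; CDbl(AMax) * BMax * 8
End Sub
```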
#4
Posted to microsoft.public.excel.programming
Memory Problem
Do you realize what you are doing? You are creating a multi-dimensional array 65000 x 200 x 1.
Ignore the 1. What is 65,000 to the power of 200? Right, a really big number.

This has nothing to do with loops. It has everything to do with memory allocation, and I don't care how much you spend on RAM SIMMs, you're not going to get there. Now tell us what you're really trying to accomplish, because this is never going to work.
#5
Posted to microsoft.public.excel.programming
Memory Problem
You will need to upgrade your computer to do that. I have 3.7GHz
(overclocked) with 1GB RAM and I can do it very easily; it does use up about 20% of my RAM right off the bat. I tried including 2 screenshots so you can look and see what it is doing in your RAM, but it is too big; email me if you want the pics.

keith
www.kjtfs.com
#6
Posted to microsoft.public.excel.programming
Memory Problem
Not quite 65,000 to the power of 200, more like 65,000 x 200 - still a big
number and you're going to break something, I suspect!

"medialint" wrote in message ... [previous post quoted]
#7
Posted to microsoft.public.excel.programming
Memory Problem
Yes, I support the above theory of x*y elements. E.g., a 5 by 2 array
(we will say base 1!):

1,1  1,2
2,1  2,2
3,1  3,2
4,1  4,2
5,1  5,2

That is 10 elements, not 25: 5*2 = 10. So the element growth is linear, not exponential, but it still adds up quickly.

Keith
www.kjtfs.com
#8
Posted to microsoft.public.excel.programming
Memory Problem
Sorry for the hiatus. I am looking to forecast hourly data over 15 or
more years. Ignoring leap years gives me 8760 hours per year. 8760 * 15 years = 131,400 elements. I then want to do this stochastically. Thus my 1D array will need to be 131,400 elements, and my 2D array will need to incorporate the stochastic runs (i.e. 1,000 or so).

I just used 65,000 because I was going to print the array data to Excel to see if I was doing it right. If 65,000 by 1,000 doesn't work, then I do not know how I will get around needing 131,400 by 1,000. When you use hourly data over a large time period, Excel just doesn't cut it due to its memory issues. Models get bogged down due to the large amount of data. I was hoping that VBA would be the answer for this.
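One way around a 131,400 x 1,000 allocation (a sketch, not from the thread): hold only one stochastic run in memory at a time, reduce it to whatever per-run summaries the model needs, and reuse the array for the next run. `ForecastHour` below is a hypothetical stand-in for the poster's actual model.

```vba
Sub RunStochastic()
    Const HOURS As Long = 131400    ' 8760 hours * 15 years
    Const RUNS As Long = 1000

    Dim hourly() As Double          ' one run at a time: ~1 MB, not ~1 GB
    ReDim hourly(1 To HOURS)

    Dim r As Long, h As Long, runTotal As Double
    For r = 1 To RUNS
        For h = 1 To HOURS
            hourly(h) = ForecastHour(r, h)
        Next h
        ' Keep only a summary per run; the full run is overwritten next pass.
        runTotal = 0
        For h = 1 To HOURS
            runTotal = runTotal + hourly(h)
        Next h
        Debug.Print "Run"; r; "total:"; runTotal
    Next r
End Sub

Function ForecastHour(r As Long, h As Long) As Double
    ' Hypothetical placeholder for the poster's stochastic model.
    ForecastHour = Rnd()
End Function
```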
#9
Posted to microsoft.public.excel.programming
Memory Problem
For that amount of data you will need a database, or even better, some type
of programmed data structure might be more efficient, like a linked list or something like that.

Keith
www.kjtfs.com
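If a full database is overkill, streaming each run to a plain text file keeps memory flat no matter how long the horizon grows; a minimal sketch of that idea (the file path and one-value-per-hour layout are assumptions, not from the thread):

```vba
Sub WriteRunToCsv()
    Dim f As Integer
    f = FreeFile
    Open "C:\forecast\run_001.csv" For Output As #f

    ' One line per hour: hour index, simulated value. Nothing is held
    ' in memory beyond the current line.
    Dim h As Long
    For h = 1 To 131400
        Print #f, h & "," & Rnd()
    Next h

    Close #f
End Sub
```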
#10
Posted to microsoft.public.excel.programming
Memory Problem
Consider using SPSS as a data exploratory tool for this type of analysis:
http://www.spss.com

Cheers
Nigel

"ExcelMonkey" wrote in message ... [previous post quoted]