
Solved: Practical limit to size of array?



skulakowski
01-10-2006, 12:28 PM
What's the practical limit to the size of an array? I'm trying to set up an array of 380,000 rows by 148 columns (basically 6 sheets of 64,000 rows each, plus a few more on the 7th sheet). My Windows XP Professional PC is (supposed to be) very capable, with 2 GB of RAM. How can I tell how much memory is available, and how much is in use? Are there other memory options I can change to make this array work?

Sub GetChartData()
    Dim SheetCount As Long
    Dim SheetNames() As Variant
    Dim i As Long
    Dim TotalLoanCount As Long

    SheetCount = ActiveWorkbook.Sheets.Count
    SheetCount = SheetCount - 1 'change number of extra sheets here
    ReDim SheetNames(1 To SheetCount, 0 To 2) As Variant
    TotalLoanCount = 0

    'For each sheet, store its name, its record count, and a running total of records
    For i = 1 To SheetCount
        SheetNames(i, 0) = ActiveWorkbook.Sheets(i).Name
        SheetNames(i, 1) = Worksheets(SheetNames(i, 0)).Range("A2").End(xlDown).Row - 1

        If i > 1 Then
            SheetNames(i, 2) = SheetNames(i, 1) + SheetNames(i - 1, 2)
        Else
            SheetNames(i, 2) = SheetNames(i, 1)
        End If
    Next i

    TotalLoanCount = SheetNames(SheetCount, 2)

    Dim ChartingArray() As Double
    Dim qRow As Long
    Dim qOtherRow As Long
    Dim p As Long
    Dim n As Long
    Dim StartRow As Long
    Dim EndRow As Long

    For i = 1 To SheetCount
        If i = 1 Then
            ReDim ChartingArray(0 To TotalLoanCount - 1, 0 To 147) As Double
            StartRow = 0
        Else
            StartRow = SheetNames(i - 1, 2)
        End If

        'etc. etc. etc.
    Next i
End Sub


At the "ReDim ChartingArray(0 To TotalLoanCount - 1, 0 To 147) As Double" command, I get "run-time error 7, out of memory".

Anybody have any ideas? The ChartingArray data is really three sets of 48 columns plus three other fields. The only thing I can think of is to run three passes through the data, handling each pass independently.

Killian
01-11-2006, 05:22 AM
Well, each variable of type Double takes 8 bytes, so a two-dimensional array of 380,000 x 148 Doubles will require around 429 MB (380,000 x 148 x 8 bytes). To say this is excessive is something of an understatement.
Further, redimensioning arrays is a slow and resource-hungry process in VB and is really only appropriate for relatively small arrays in certain circumstances.
To populate this array, you have to reference the data elements in the workbook by three indexes: sheet, row and column. It would probably be more efficient to work from the workbook directly rather than replicate all the data in an array.
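For instance, one common pattern (a sketch, not from the original post) is to read one sheet at a time into a Variant array with a single `.Value` call, which avoids both the giant Double array and the slow cell-by-cell Select calls; the 148-column width is taken from the question:

```
Sub ProcessSheetBySheet()
    Dim ws As Worksheet
    Dim data As Variant
    Dim lastRow As Long
    Dim r As Long

    For Each ws In ActiveWorkbook.Worksheets
        'Assumes each sheet has a header row and contiguous data below A2
        lastRow = ws.Range("A2").End(xlDown).Row
        data = ws.Range(ws.Cells(2, 1), ws.Cells(lastRow, 148)).Value

        For r = 1 To UBound(data, 1)
            'process data(r, 1) ... data(r, 148) here
        Next r
        'data is released and reused on the next sheet,
        'so memory use peaks at one sheet, not the whole dataset
    Next ws
End Sub
```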

However, a single instance of your dataset is only 148 x 8 bytes, a little over 1 KB. If you have a few things to do with the data once you've got it, I would suggest creating a data structure (a 1-D array, user-defined Type, or class) and recycling it 380,000 times.
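A minimal sketch of that idea, assuming the 148 fields fit in a fixed-size user-defined Type (the Type and field names are illustrative, not from the original code; the Type must be declared at module level):

```
'One record: 148 Doubles, about 1.2 KB, reused for every row
Private Type LoanRecord
    Fields(0 To 147) As Double
End Type

Sub ProcessOneRecordAtATime()
    Dim rec As LoanRecord
    Dim ws As Worksheet
    Dim r As Long, c As Long

    For Each ws In ActiveWorkbook.Worksheets
        For r = 2 To ws.Range("A2").End(xlDown).Row
            For c = 0 To 147
                rec.Fields(c) = ws.Cells(r, c + 1).Value
            Next c
            'do the per-record work here, then let rec be overwritten
        Next r
    Next ws
End Sub
```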

skulakowski
01-11-2006, 08:52 AM
Killian, thanks. I can process each sheet individually. (Don't know why that didn't occur to me!)

To process the records individually, I set up one-dimensional arrays, processed the first record, wrote out its results, and then picked up the next record into the same arrays.

Now I'm reporting on the whole dataset. The simple way I saw was to grab all the records at once. But I can calculate the reporting results for one sheet, remember them, and then move on to the next sheet.
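That per-sheet approach might look something like this (a sketch only; the "result" here is just a per-column running sum, standing in for whatever the real report calculates):

```
Sub ReportAcrossSheets()
    Dim ws As Worksheet
    Dim totals(0 To 147) As Double   'per-column results, carried across sheets
    Dim data As Variant
    Dim r As Long, c As Long

    For Each ws In ActiveWorkbook.Worksheets
        'Read one sheet at a time; assumes contiguous data below A2
        data = ws.Range(ws.Cells(2, 1), _
                        ws.Cells(ws.Range("A2").End(xlDown).Row, 148)).Value
        For r = 1 To UBound(data, 1)
            For c = 0 To 147
                totals(c) = totals(c) + data(r, c + 1)
            Next c
        Next r
    Next ws
    'totals() now reflects the whole dataset
    'without ever holding a 380,000-row array in memory
End Sub
```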

While I expect my users to have sets of fewer than 200,000 records, I figured that a test sample of 400,000 would show data stupidities early, letting me identify (most of) them (each user is unique in his own way); and if the records could be processed fast enough, my users would be happy.

One day, I'm going to be a real programmer... I may even learn how to create data structures.

Thanks again.
Susan