Aegis
07-09-2007, 01:44 PM
I've got the sub working basically how I want, but with one problem...some of the SQL queries will return result sets longer than 65,536 records, and an Excel worksheet can only hold that many rows. Time constraints require sticking with the existing Excel solution. So, my question is, how can I ensure that I only try to dump 65,534 records into Excel at once? I'm currently just using
ExcelWorkbook.sheets(1).range("A2").copyfromRecordset myRecordSet
I'd use GetRows(65534) if I knew 1) an array can get that big, and 2) how to get the next 65,534 results. Any input on how to limit the number of records copied would be helpful...perhaps a SQL query to limit results, or a way to split the data into multiple tables of only 65.5k rows each...anything other than
While count < 65534
.....workbook.sheet.range = myRecordSet.Fields(0)
.....myRecordSet.MoveNext
.....count = count + 1
Wend
as that is unacceptably slow. :P
Oh yeah, and after the 65.5k limit is reached, the rest of the results need to be dumped into another worksheet, consecutively named junk1.xls, junk2.xls, etc.
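One approach worth sketching: CopyFromRecordset takes an optional MaxRows argument that caps how many rows a single call copies, and the recordset cursor is left positioned after the last copied row, so repeated calls pick up where the previous one stopped. A rough sketch, assuming myRecordSet is an open ADO recordset and ExcelWorkbook is an open workbook object (variable and sheet names here are illustrative, not from the original post):

```vb
' Sketch only: chunk a large recordset across worksheets using
' CopyFromRecordset's optional MaxRows argument.
Const MAX_ROWS = 65534          ' leaves room for a header row in row 1

Dim sheetIndex
sheetIndex = 1

Do Until myRecordSet.EOF
    ' Add worksheets on the fly once the existing ones are used up
    If sheetIndex > ExcelWorkbook.Sheets.Count Then
        ExcelWorkbook.Sheets.Add , ExcelWorkbook.Sheets(ExcelWorkbook.Sheets.Count)
    End If

    ' Copy at most MAX_ROWS rows; the cursor advances, so the next
    ' pass through the loop continues with the following records.
    ExcelWorkbook.Sheets(sheetIndex).Range("A2").CopyFromRecordset myRecordSet, MAX_ROWS

    sheetIndex = sheetIndex + 1
Loop
```

This keeps the fast bulk-copy path instead of the row-by-row MoveNext loop. If separate files (junk1.xls, junk2.xls, ...) are genuinely required rather than separate sheets, the same loop works with a Workbooks.Add / SaveAs per iteration instead of Sheets.Add.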