
Free Support Forum - aspose.com

Save 1,000,000 rows with LightCells column by column

Hi everyone,

My goal is to save a very large amount of data (around 1,000,000 rows) in one Excel file. I read a lot about this problem, and it seems LightCellsDataProvider is the right path for me. I tested it and it really is very fast and does not take much memory. The only problem I have now is… how to store data column by column and not row by row… With the simple implementation (no LightCells) I can just use a nested loop like this:

 int counter = 0;
 for (int j = 0; j < rowCount; j++) {    // rows
     for (int i = 0; i < 15; i++) {      // columns
         Cell cell = cells.get(j, i);
         cell.setValue("Test" + counter++);
     }
 }

And decide which order I want - e.g. fill column by column (by swapping the two loops). Now with LightCellsDataProvider I think the order of filling data is fixed - row by row. Am I right? I added some logging to my LightCellsDataProvider implementation so I can see the order of calls, and it tells me that I am right:

startSheet
nextRow
startRow
nextCell
startCell
nextCell
startCell
nextRow
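The call sequence above can be simulated with a plain-Java sketch (a hypothetical stand-in, not the real Aspose interface) that shows why the order is fixed: the writer drives the provider row by row, asking for the next row and then for each cell within it:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the LightCells callback loop, only to illustrate
// the call order logged above: the writer asks for the next row, then for
// each cell within that row, so the provider is always driven row by row.
public class CallOrderDemo {
    public static List<String> writeSheet(int rowCount, int colCount) {
        List<String> calls = new ArrayList<>();
        calls.add("startSheet");
        for (int row = 0; row < rowCount; row++) {
            calls.add("nextRow");
            calls.add("startRow");
            for (int col = 0; col < colCount; col++) {
                calls.add("nextCell");
                calls.add("startCell");
            }
        }
        return calls;
    }
}
```

Running this for a 2x2 sheet reproduces the same startSheet / nextRow / startRow / nextCell / startCell pattern seen in the log.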

My question is: is there a way to use LightCells and store data column by column rather than row by row? It is very important in my project to do it in this order. BTW, I am using Java and Aspose.Cells 7.3.1.

Any information will be very helpful…

cheers

@pawelforums,

Thanks for your query and the details.

Well, I am afraid this is not possible in light mode, because the data in an Excel file is saved in row-by-row order, so we cannot send the data to the file column by column.
If you want to fill data column by column, you have to use normal mode and build all the cell data in memory. Since your main goal is to reduce memory cost while keeping your special fill order, there is no better way around this. We recommend you try the MemoryPreference mode; see the document for your reference:
https://docs.aspose.com/display/cellsjava/Optimizing+Memory+Usage+while+Working+with+Big+Files+having+Large+Datasets
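Since the file format forces row-by-row emission, one workaround worth considering (my own sketch, not an Aspose feature - the class and method names below are illustrative) is to compute the values column by column into an in-memory buffer first, and then let the LightCells provider read that buffer row by row. Buffering the full sheet costs memory, so this mainly pays off if the data can be buffered in chunks of rows:

```java
// Sketch of a column-first buffer that is later read in row-major order,
// e.g. from a LightCellsDataProvider's startCell callback.
// All names here are illustrative, not part of the Aspose API.
public class ColumnFirstBuffer {
    private final String[][] grid; // grid[row][col]

    public ColumnFirstBuffer(int rowCount, int colCount) {
        grid = new String[rowCount][colCount];
    }

    // Fill phase: iterate columns in the outer loop, rows in the inner one,
    // matching the column-by-column order the computation requires.
    public void fillColumnByColumn() {
        int counter = 0;
        for (int col = 0; col < grid[0].length; col++) {
            for (int row = 0; row < grid.length; row++) {
                grid[row][col] = "Test" + counter++;
            }
        }
    }

    // Read phase: the provider consumes the buffer row by row.
    public String valueAt(int row, int col) {
        return grid[row][col];
    }
}
```

With a 2-row, 3-column buffer, filling writes Test0 and Test1 down the first column, Test2 and Test3 down the second, and so on, while reads can still proceed row by row.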

Hope this helps a bit.

OK, thank you very much for your time and the answer.