When I try to create a sheet with more than 200,000 rows and 100 columns, I am getting a Java heap out-of-memory issue. I am using Aspose.Cells version 24.7 with Java 21.
You’re seeing an OutOfMemoryError when creating a worksheet with more than 200,000 rows and 100 columns because the default JVM heap is not large enough for that amount of data.
What you can try
- Increase the JVM heap size – start the JVM with a larger heap, for example:

  ```
  java -Xms16G -Xmx16G -XX:+UseG1GC -jar YourApp.jar
  ```

  A similar configuration was used in a discussion about large‑worksheet processing that helped avoid OOM [2].
- Confirm the new limits – after the workbook is created, print the heap statistics to verify that the settings were applied:

  ```java
  System.out.println("Heap: " + Runtime.getRuntime().totalMemory() / (1024 * 1024) + " MB");
  ```
- Monitor memory usage – observe the heap while the workbook is being built (see the sketch after this list). Other large‑workbook operations (e.g., PDF conversion) have been reported to consume several gigabytes of memory, so allocating a multi‑gigabyte heap is often necessary [3].
- Consider data size – if OOM persists even with a large heap, try reducing the worksheet size or processing the data in smaller batches.
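For the monitoring step, here is a minimal snippet using only plain JDK calls (no Aspose-specific API) that you could invoke periodically while the sheet is being filled:

```java
// Report current heap usage; call this periodically while populating the sheet.
Runtime rt = Runtime.getRuntime();
long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
long maxMb  = rt.maxMemory() / (1024 * 1024);
System.out.println("Heap used: " + usedMb + " MB of " + maxMb + " MB max");
```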
Next step
Run the code with the increased heap and watch the memory values. If the error remains, please provide a minimal code snippet that reproduces the issue and the exact JVM flags you are using so we can look further into the problem.
@Vishal038
Please apply the following memory setting to reduce memory usage:
```java
workbook.getSettings().setMemorySetting(MemorySetting.MEMORY_PREFERENCE);
// or: MemorySetting.FILE_CACHE
```
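For context, here is a minimal sketch of where this setting fits into workbook creation; the loop bounds mirror your 200,000 × 100 scenario, and (if I recall the documentation correctly) the default first sheet already exists when the workbook-level setting is applied, so it is set on that sheet's Cells as well:

```java
import com.aspose.cells.*;

public class MemoryPreferenceDemo {
    public static void main(String[] args) throws Exception {
        Workbook workbook = new Workbook();
        // Applies to worksheets created after this point...
        workbook.getSettings().setMemorySetting(MemorySetting.MEMORY_PREFERENCE);

        Cells cells = workbook.getWorksheets().get(0).getCells();
        // ...the default first sheet already exists, so set it there too.
        cells.setMemorySetting(MemorySetting.MEMORY_PREFERENCE);

        for (int r = 0; r < 200_000; r++) {
            for (int c = 0; c < 100; c++) {
                cells.get(r, c).putValue("R" + r + "C" + c); // sample data
            }
        }
        workbook.save("large.xlsx");
    }
}
```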
Handling large datasets is a common challenge, especially when creating a sheet with more than 200,000 rows and 100 columns and encountering a Java heap out-of-memory error. Please note that the default approach loads all cell data into memory, which can be highly resource-intensive.
For your specific setup, we recommend enabling memory optimization mode (as suggested by @simon.zhao). Alternatively, if the dataset is exceptionally large, you might consider the LightCells API. If you are currently populating cells with cells.get("A1").putValue(...), you could instead implement LightCellsDataProvider to stream rows to disk, which avoids loading millions of cells into RAM simultaneously (a sketch follows below). For further details, you may refer to the documentation here: Using LightCells API|Documentation
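As a rough sketch of that streaming approach, assuming the provider is attached through OoxmlSaveOptions as in the LightCells documentation (the class name and sample cell values below are placeholders for your real data):

```java
import com.aspose.cells.*;

// Hypothetical provider that streams 200,000 x 100 cells to disk while saving,
// so the full grid is never held in memory at once.
class LargeSheetProvider implements LightCellsDataProvider {
    private static final int ROWS = 200_000, COLS = 100;
    private int row = -1, col = -1;

    public boolean startSheet(int sheetIndex) {
        return sheetIndex == 0; // only stream data for the first sheet
    }

    public int nextRow() {
        col = -1;                        // reset the column cursor for the new row
        return ++row < ROWS ? row : -1;  // -1 means no more rows
    }

    public void startRow(Row r) { /* optionally set row height/style here */ }

    public int nextCell() {
        return ++col < COLS ? col : -1;  // -1 means the current row is done
    }

    public void startCell(Cell cell) {
        cell.putValue("R" + row + "C" + col); // replace with your real data
    }

    public boolean isGatherString() {
        return false; // write strings inline instead of a shared-string table
    }
}

public class LightCellsDemo {
    public static void main(String[] args) throws Exception {
        Workbook workbook = new Workbook();
        OoxmlSaveOptions options = new OoxmlSaveOptions();
        options.setLightCellsDataProvider(new LargeSheetProvider());
        workbook.save("large.xlsx", options); // cells are generated on the fly
    }
}
```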
Additionally, we suggest increasing the JVM heap size to accommodate the memory requirements.
If the issue still persists, we would appreciate it if you could share a sample console Java application or program that reproduces the issue. This will allow us to investigate the problem more thoroughly and assist you further.