Our server, with 2 GB of heap allocated, crashes when loading a 50 MB presentation via new Presentation() on the following line:
Presentation doc = new Presentation(<SourceFilePath>);
We would like to know two things:
- Is there a good way to gauge what file size can be converted with a given amount of heap space?
- Is there a way to optimize a PPTX slide deck so that a large number of slides (50+) can be converted?
Here is the line that's crashing the server (see the attached screenshot, Erroring Code.PNG, 232.1 KB).
Thank you for contacting support.
Unfortunately, there is no reliable way to predict how much heap a given presentation will consume; it depends on the presentation's contents (images, embedded media, shapes), not just on its file size.
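There is no API for this, but you can estimate it empirically. Below is a rough plain-JDK sketch (no Aspose dependency) that samples used heap before and after a load; the 50 MB byte-array allocation is only a stand-in for where the real `new Presentation(path)` call would go:

```java
public class HeapProbe {
    static byte[] retained; // keep the allocation reachable so it stays on the heap

    // Returns the approximate number of heap bytes consumed by running `work`.
    static long heapDelta(Runnable work) {
        Runtime rt = Runtime.getRuntime();
        System.gc(); // try to settle the heap before sampling
        long before = rt.totalMemory() - rt.freeMemory();
        work.run();
        long after = rt.totalMemory() - rt.freeMemory();
        return after - before;
    }

    public static void main(String[] args) {
        // Stand-in for `retained = ...; new Presentation(path)`: allocate ~50 MB.
        long delta = heapDelta(() -> retained = new byte[50 * 1024 * 1024]);
        System.out.println("Approx. heap consumed: " + (delta / (1024 * 1024)) + " MB");
    }
}
```

Measuring a handful of representative decks this way gives a per-deck baseline; GC timing makes the numbers approximate, so treat them as a lower bound when sizing the heap.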
For large presentations, you should try to restrict the memory usage for loading like this:
LoadOptions loadOptions = new LoadOptions();
loadOptions.getBlobManagementOptions().setMaxBlobsBytesInMemory(128 * 1024 * 1024); // 128 MB, for example
Presentation presentation = new Presentation("MockPowerPoint.pptx", loadOptions);
If you generate presentations by adding large objects (images, videos, etc.), you can also restrict their memory usage.
Documents: Open Large Presentation, Manage Blob
API Reference: IBlobManagementOptions Interface
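For the generation-side case, a sketch along the lines of the Manage Blob article might look like this. It assumes the Aspose.Slides for Java `addVideo` overload that takes a `LoadingStreamBehavior`; the file names here are placeholders:

```java
import com.aspose.slides.*;
import java.io.FileInputStream;
import java.io.InputStream;

public class AddLargeVideo {
    public static void main(String[] args) throws Exception {
        Presentation pres = new Presentation();
        try (InputStream videoStream = new FileInputStream("large-video.mp4")) {
            // KeepLocked lets the library serve the blob from the stream on demand
            // instead of buffering the whole file on the heap.
            IVideo video = pres.getVideos().addVideo(videoStream, LoadingStreamBehavior.KeepLocked);
            pres.getSlides().get_Item(0).getShapes().addVideoFrame(50, 50, 320, 240, video);
            pres.save("with-video.pptx", SaveFormat.Pptx);
        } finally {
            pres.dispose();
        }
    }
}
```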
We tried updating the LoadOptions as you suggested, but the server still crashes at around a 15 MB presentation size.
Are there any other loadOptions that deal with images?
Please try passing a value of zero to the setMaxBlobsBytesInMemory method. If the issue persists, please share the following data:
- input presentation files
- simple project that reproduces the problem
- OS version on your server
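For reference, the zero-value setting suggested above would look like this (the file name is a placeholder):

```java
LoadOptions loadOptions = new LoadOptions();
// Zero should keep blob bytes out of the heap as far as the blob manager allows.
loadOptions.getBlobManagementOptions().setMaxBlobsBytesInMemory(0);
Presentation presentation = new Presentation("MockPowerPoint.pptx", loadOptions);
```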
Unfortunately, the only ways to influence this process are those described in the articles linked above.