Large heap dump files while converting PDF to TIFF

Hi,
I have a dockerized Spring Boot application that I use as an API to convert various file formats (PDF, Word, Excel, images) to TIFF.
I can connect inside the Docker container's filesystem to create a heap dump of the Spring Boot process.
Taken just after running `docker-compose up`, the heap dump is about 700 MB.
However, heap dumps taken while a PDF-to-TIFF conversion is in progress can reach 1.9 GB (I used a PDF of 20-30 pages).
Is this expected when running a PDF-to-TIFF converter?
If not, I can send you the heap dump file for analysis.
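For reference, this is roughly how I capture the dumps from outside the container using `jcmd` (the container name `my-spring-app`, the in-container PID `1`, and the dump path are placeholders for my setup):

```shell
# List the JVMs running inside the container to find the PID
docker exec my-spring-app jcmd -l

# Trigger a heap dump via the JVM's diagnostic interface
docker exec my-spring-app jcmd 1 GC.heap_dump /tmp/heap.hprof

# Copy the dump out of the container for analysis
docker cp my-spring-app:/tmp/heap.hprof ./heap.hprof
```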

PS: I would also like to know whether there are any clean-up methods I should call after a conversion is done. I also need to know whether these methods are thread-safe, because we are going to convert files in parallel via the Spring Boot endpoints.
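To make the question concrete, here is a minimal sketch of the per-request conversion flow I have in mind, assuming Aspose.PDF for Java's `TiffDevice` API (the paths, 300 DPI resolution, and class name are illustrative, not our actual code):

```java
import com.aspose.pdf.Document;
import com.aspose.pdf.devices.Resolution;
import com.aspose.pdf.devices.TiffDevice;
import com.aspose.pdf.devices.TiffSettings;

public class PdfToTiffConverter {

    // Called concurrently from the Spring Boot endpoints; each call
    // works on its own Document instance, so no state is shared.
    public static void convert(String pdfPath, String tiffPath) {
        Document document = new Document(pdfPath);
        try {
            TiffSettings settings = new TiffSettings();
            TiffDevice device = new TiffDevice(new Resolution(300), settings);
            device.process(document, tiffPath);
        } finally {
            // The clean-up step in question: is close() enough to
            // release the memory held for the loaded PDF, and is it
            // safe when several conversions run in parallel?
            document.close();
        }
    }
}
```

My assumption is that giving each request its own `Document` avoids sharing state between threads, but please confirm whether additional clean-up is needed.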

@stsakas

In case you are using an old version of Aspose.PDF, we suggest you upgrade to the latest version, Aspose.PDF for Java 22.3. If you still face the problem, please share the following details here for testing:

  • The input PDF along with a screenshot of the heap dump file.
  • Your dockerized Spring Boot application (source code without compilation errors) for testing.
  • Your working environment, e.g. operating system, Java version, etc.

As soon as you have this information ready, we will start investigating your issue and provide more details. Thanks for your cooperation.

PS: To attach these resources, please zip and upload them.