Hi, I apologize up front for the lack of an explicit example of this problem. I have not been able to narrow down exactly what is causing this behavior.
I'm creating fairly large PDF files (10-15 MB on average). I load a series of data objects, write each one to the PDF as a table (15 rows on average), and save the document to a MemoryStream. This works well for smaller files but not for larger ones, so I'm attempting to refactor to use Direct-To-File mode. A simplified outline of the code is below.
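To be clear about the structure (not a compilable repro): the only library calls shown verbatim are `new Pdf(Stream)` and `Section.AddParagraph`; `MyDataObject`, `GetSection`, and `AddTableRows` are stand-ins for my own code, and the finalization step is omitted.

```csharp
using System.Collections.Generic;
using System.IO;

public static class PdfExport
{
    // Outline only: MyDataObject, GetSection, and AddTableRows are stand-ins
    // for my own code. The real library calls are new Pdf(Stream) and
    // Section.AddParagraph.
    public static void Export(string path, IEnumerable<MyDataObject> objects)
    {
        using (var file = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            var pdf = new Pdf(file);   // Direct-To-File mode: write to the
                                       // FileStream instead of a MemoryStream
            foreach (var obj in objects)
            {
                Section section = GetSection(pdf); // one table per data object
                AddTableRows(section, obj);        // ~15 rows, each built up
                                                   // through Section.AddParagraph
            }
            // finalization/save call omitted here
        }
    }
}
```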
What I've noticed is this:
* My PDF on disk grows and grows, then it stops growing, memory usage starts climbing, and nothing is ever written to the file stream again.
* This seems to be directly related to specific data objects I am attempting to write. If I exclude those specific objects, the PDF writes to disk properly. The problematic objects are no larger than the objects that write correctly.
My problem is that I am unable to determine what it is about the data in those objects that causes this behavior. The files I'm testing with inevitably max out the memory available to a 32-bit process, so I can't tell whether the write would ever finish.
Could you tell me:
* How and when does the Pdf determine it needs to flush data to the file? Calling "Section.AddParagraph" doesn't seem to flush the stream on every call (the sketch after this list shows how the stall can be observed).
* Is there anything outside of "new Pdf(Stream)" and "Section.AddParagraph" I need to be aware of when using Direct-To-File mode?
* Is there any explicit documentation or an article that describes this mode in its entirety and how to use it?
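For reference, here is roughly how the stall can be observed from the outside. The `FileStream.Position` check is plain .NET; `file` and `section` are the same objects as in the outline above, and the row loop is a simplified stand-in for my table-building code.

```csharp
// Check whether AddParagraph actually pushed bytes to disk by sampling the
// underlying FileStream's position after each call (plain .NET, no library
// APIs beyond AddParagraph itself).
long lastPosition = file.Position;
foreach (string rowText in rowTexts)   // simplified stand-in for my row loop
{
    section.AddParagraph(rowText);
    if (file.Position != lastPosition)
    {
        Console.WriteLine($"flushed, stream now at {file.Position:N0} bytes");
        lastPosition = file.Position;
    }
}
// What I see matches the symptoms above: the position advances in bursts for
// a while, then freezes permanently once certain objects are reached, while
// process memory keeps climbing.
```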
I've spent a fair amount of time trying to narrow down what is unique about my data that could cause this problem, but have been unsuccessful.
I appreciate any help you can give me. Thanks!