"Cannot access a closed Stream" when concatenating PDFs

Hi,


I’m getting a "Cannot access a closed Stream." exception in the code below when I try to save to a stream. I am simply trying to concatenate PDFs in a loop.

I am using version 9.8.0.0

Any help is appreciated.
Thanks.

using (var stream = new MemoryStream())
{
    // Start by opening the first PDF
    using (var pdfDoc = new Document(pdfFiles[0]))
    {
        // Starting with the second, concatenate all the rest
        for (int i = 1; i < pdfFiles.Count; i++)
        {
            using (var nextPdf = new Document(pdfFiles[i]))
            {
                pdfDoc.Pages.Add(nextPdf.Pages);
            }
        }
        pdfDoc.Save(stream); // Error: Cannot access a closed Stream.
    }
    return new FileContentResult(stream.ToArray(), "application/pdf");
}

Hi John,

Thanks for your inquiry. It seems the API still needs the merged source documents (the nextPdf objects) at the time the final document (pdfDoc) is saved, and it throws the exception once they have been disposed. I have logged a ticket, PDFNEWNET-37879, for further investigation and resolution, and we will keep you updated on its progress. Meanwhile, you can use the following code snippet, which avoids the inner using block for nextPdf.

using (var stream = new MemoryStream())
{
    // Start by opening the first PDF
    using (var pdfDoc = new Document(pdfFiles[0]))
    {
        var nextPdf = new Document();
        // Starting with the second, concatenate all the rest
        for (int i = 1; i < pdfFiles.Count; i++)
        {
            nextPdf = new Document(pdfFiles[i]);
            pdfDoc.Pages.Add(nextPdf.Pages);
            pdfDoc.Save(stream); // Save before disposing nextPdf so its pages are still accessible
            nextPdf.Dispose();
        }
        return new FileContentResult(stream.ToArray(), "application/pdf");
    }
}

We are sorry for the inconvenience caused.

Best Regards,

Thanks, but doesn’t this cause the problem I was trying to prevent (by utilizing the using construct)?


After you create a new Document, when does Dispose get called? You are only calling Dispose on the last instance of nextPdf; what about all the others?
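
For reference, something like the following is what I had in mind: keep every source document open until after Save, then dispose them all in a finally block so they are cleaned up even if an exception is thrown. This is only a sketch based on your explanation above; I haven't verified that it avoids the bug in 9.8.0.0, and it assumes a using directive for System.Collections.Generic is already present.

using (var stream = new MemoryStream())
{
    using (var pdfDoc = new Document(pdfFiles[0]))
    {
        // Keep every merged source document open until after Save,
        // since the API apparently still reads from them at save time.
        var sources = new List<Document>();
        try
        {
            for (int i = 1; i < pdfFiles.Count; i++)
            {
                var nextPdf = new Document(pdfFiles[i]);
                sources.Add(nextPdf);
                pdfDoc.Pages.Add(nextPdf.Pages);
            }

            pdfDoc.Save(stream); // All source documents are still open here.
        }
        finally
        {
            // Dispose every source document, even if Add or Save throws.
            foreach (var doc in sources)
            {
                doc.Dispose();
            }
        }

        return new FileContentResult(stream.ToArray(), "application/pdf");
    }
}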

Hi John,


Thanks for your feedback. I have just shared a workaround for the time being; however, we will keep you updated about the progress of the above logged issue.

We are sorry for the inconvenience caused.

Best regards,