
Free Support Forum - aspose.com

Cannot access a closed Stream Concatenating PDFs

Hi,


I’m getting a “Cannot access a closed Stream.” exception in the code below when I try to save to a stream. I am simply trying to concatenate PDFs in a loop.

I am using version 9.8.0.0

Any help is appreciated.
Thanks.

using (var stream = new MemoryStream())
{
    // Start by opening first pdf
    using (var pdfDoc = new Document(pdfFiles[0]))
    {
        // Starting with second, concat all the rest
        for (int i = 1; i < pdfFiles.Count; i++)
        {
            using (var nextPdf = new Document(pdfFiles[i]))
            {
                pdfDoc.Pages.Add(nextPdf.Pages);
            }
        }
        pdfDoc.Save(stream); // Error: Cannot access a closed Stream.
    }
    return new FileContentResult(stream.ToArray(), "application/pdf");
}

Hi John,


Thanks for your inquiry. It seems the API looks for the merged files (the nextPdf objects) when saving the final document (pdfDoc), and throws the exception because they have already been disposed. I have logged ticket PDFNEWNET-37879 for further investigation and resolution. We will keep you updated about the progress of the issue. Meanwhile, you can use the following code snippet, which omits the second using block.

using (var stream = new MemoryStream())
{
    // Start by opening first pdf
    using (var pdfDoc = new Document(pdfFiles[0]))
    {
        var nextPdf = new Document();
        // Starting with second, concat all the rest
        for (int i = 1; i < pdfFiles.Count; i++)
        {
            nextPdf = new Document(pdfFiles[i]);
            pdfDoc.Pages.Add(nextPdf.Pages);
        }
        pdfDoc.Save(stream); // Saves successfully; nextPdf is still open here
        nextPdf.Dispose();
    }
    return new FileContentResult(stream.ToArray(), "application/pdf");
}

We are sorry for the inconvenience caused.


Best Regards,

Thanks, but doesn’t this cause the problem I was trying to prevent (by utilizing the using construct)?


After you create a new Document, when does Dispose get called? You are only calling Dispose on the last instance of nextPdf; what about all the others?
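For what it's worth, one way to keep every source document open until the save completes and still dispose them all is to collect them in a list and dispose them in a finally block. This is only a sketch based on the Aspose.Pdf Document and Pages.Add calls used above, not an officially recommended pattern:

using (var stream = new MemoryStream())
{
    // Track every Document we open so all of them can be disposed later
    var openDocs = new List<Document>();
    try
    {
        var pdfDoc = new Document(pdfFiles[0]);
        openDocs.Add(pdfDoc);
        for (int i = 1; i < pdfFiles.Count; i++)
        {
            var nextPdf = new Document(pdfFiles[i]);
            openDocs.Add(nextPdf); // keep it open until after Save
            pdfDoc.Pages.Add(nextPdf.Pages);
        }
        pdfDoc.Save(stream); // all source documents are still open at this point
    }
    finally
    {
        // Dispose every document, not just the last one
        foreach (var doc in openDocs)
            doc.Dispose();
    }
    return new FileContentResult(stream.ToArray(), "application/pdf");
}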

Hi John,


Thanks for your feedback. I have shared the workaround for the time being; however, we will keep you updated about the progress of the logged issue.

We are sorry for the inconvenience caused.

Best regards,