I have a DataReader. I do not import its data into the workbook directly. Instead, I assign it to a LightCellsDataProvider implementation, so that my override of NextRow() reads the next row from the DataReader. Then I call Save(), saving to the Response stream.
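For context, my setup looks roughly like this (a simplified sketch; `ReaderDataProvider` is just my illustrative name, and the Aspose.Cells interface members are as I understand them from the docs):

```csharp
// Sketch: the provider pulls rows from the DataReader on demand,
// instead of loading them all into the Workbook up front.
public class ReaderDataProvider : LightCellsDataProvider
{
    private readonly IDataReader _reader;   // the open DataReader
    private int _rowIndex = -1;
    private int _cellIndex = -1;

    public ReaderDataProvider(IDataReader reader) { _reader = reader; }

    // Only the first sheet is populated from the reader.
    public bool StartSheet(int sheetIndex) => sheetIndex == 0;

    // Advance the DataReader; return the next row index, or -1 when done.
    public int NextRow()
    {
        if (!_reader.Read()) return -1;
        _cellIndex = -1;
        return ++_rowIndex;
    }

    public void StartRow(Row row) { }

    // Return the next column index in the current row, or -1 at row end.
    public int NextCell() =>
        ++_cellIndex < _reader.FieldCount ? _cellIndex : -1;

    public void StartCell(Cell cell) =>
        cell.PutValue(_reader.GetValue(_cellIndex));

    public bool IsGatherString() => false;
}

// Saving: attach the provider via the save options and write to the Response.
var options = new OoxmlSaveOptions { LightCellsDataProvider = new ReaderDataProvider(reader) };
workbook.Save(Response.OutputStream, options);
```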
Two problems:
- With data of about a million rows, I get an OutOfMemory exception at about row 986,000. (Before implementing LightCellsDataProvider, I got the OutOfMemory exception at about row 772,000.)
- Even though I’m saving to the Response stream, the Save File dialog does not pop up in the client browser until all rows have been read from the reader. I tried passing the Response to my LightCellsDataProvider implementation and calling Response.Output.Flush() and Response.Flush() in the NextRow() override, but that doesn’t change the situation.
With fewer rows, I don’t get an OutOfMemory exception, and I do get a valid file, so my implementation of LightCellsDataProvider appears to be correct.
To overcome #1, I have changed to saving to a temp file on disk and then using Response.TransmitFile. This works: I avoid the OutOfMemory exception. However, I still have problem #2, in that the Save File dialog does not open in the client browser for a long time.
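The temp-file workaround I’m using looks roughly like this (a sketch; the file name and content type are just what I chose, and the Save File dialog still doesn’t appear until the whole file has been written to disk):

```csharp
// Sketch: save to a temp file first, then stream it to the client.
// TransmitFile sends the file without buffering it in server memory.
string tempPath = Path.GetTempFileName();
workbook.Save(tempPath, options);   // 'options' carries the LightCellsDataProvider

Response.Clear();
Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
Response.AddHeader("Content-Disposition", "attachment; filename=report.xlsx");
Response.AddHeader("Content-Length", new FileInfo(tempPath).Length.ToString());
Response.TransmitFile(tempPath);
Response.Flush();
// The temp file can be deleted after the response completes.
```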
To mitigate problem #2, I can show a spinner as an indication that the user should be patient and wait for the file.
However, it seems to me that if I have a DataReader and a LightCellsDataProvider implementation that reads from that DataReader, and I am saving to the Response stream, then I shouldn’t have OutOfMemory issues. It appears that Cells is still loading all of the data before it starts writing to the Response. I don’t need to read the rows back out of the Workbook after they are loaded; I simply need to save the workbook out to the Response stream. And if Response.Flush() were called multiple times during Save(), it should trigger the client browser to open the Save File dialog once enough bytes have been received.
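To illustrate the behavior I was hoping for: a pass-through stream that flushes the Response every so many bytes, so the browser sees data while Save() is still running. (`FlushingStream` is purely an illustrative name of mine, not an Aspose.Cells type; and of course, if Save() buffers the whole workbook internally before writing anything, a wrapper like this won’t help, which matches what I’m seeing.)

```csharp
// Sketch only: wraps Response.OutputStream and flushes the Response
// every 'flushThreshold' bytes written by Workbook.Save().
public class FlushingStream : Stream
{
    private readonly Stream _inner;
    private readonly HttpResponse _response;
    private readonly int _flushThreshold;
    private int _bytesSinceFlush;

    public FlushingStream(HttpResponse response, int flushThreshold = 64 * 1024)
    {
        _response = response;
        _inner = response.OutputStream;
        _flushThreshold = flushThreshold;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _inner.Write(buffer, offset, count);
        _bytesSinceFlush += count;
        if (_bytesSinceFlush >= _flushThreshold)
        {
            _response.Flush();          // push buffered bytes out to the client
            _bytesSinceFlush = 0;
        }
    }

    public override void Flush() => _inner.Flush();
    public override bool CanWrite => true;
    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override int Read(byte[] buffer, int offset, int count) => throw new NotSupportedException();
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}

// Intended usage: workbook.Save(new FlushingStream(Response), options);
```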
Or, is it possible I’m missing some better way of doing this?
Thanks!
ChssAddct