Free Support Forum

Speed difference between ImportDataTable versus custom code

I am using grid desktop, and I am seeing a tremendous difference in performance between the built-in ImportDataTable method and rendering the table contents myself in a loop. For a 2000-row, 16-column table, ImportDataTable is very fast, under a second or two. Unfortunately, the table contains columns that some users cannot see (i.e. role based), and I also need to do lookups in a dictionary data object on a column's value. So I am forced to render the sheet rows in a loop, e.g.

for (int i = 0; i < dataTable.Rows.Count; i++)
{
    sheet.Cells[i, 0].Value = dataTable.Rows[i]["CustomerId"];
    sheet.Cells[i, 1].Value = dataTable.Rows[i]["Name"];
    // more cells/columns are filled as above, removed for brevity...

    // also need to do lookups outside of the data table, as below:
    sheet.Cells[i, 7].Value = LookupDictionary[dataTable.Rows[i]["Status"]];

    if (user.CanSeeDetail)
        sheet.Cells[i, 8].Value = dataTable.Rows[i]["Price"];
    // more
}

When I use the loop, it takes 50+ seconds, and only marginally less if I skip the dictionary lookup, so the lookup is not what is slowing things down. Is there any technique or trick I can use to make this run as fast as ImportDataTable? Note that I am not applying any styles or doing anything besides setting cell values.


Well, Worksheet.ImportDataTable() is an efficient method that takes much less time to fill a large data set into the worksheet. When you insert a bigger data set manually, filling each cell value from the data table fields one by one takes considerably longer.

In any case, we have logged your request in our issue tracking system with the issue id CELLSNET-16430. We will analyze whether we can enhance the manual insertion process and will get back to you soon.

Thank you.

Thanks. As a workaround, I clone the DataTable, delete columns from the clone based on the user's roles, add columns with the looked-up data, and then call ImportDataTable on the clone. This way I get the benefit of the much faster import instead of loading data rows in a 'for' loop. In my case (3000 rows by 20 columns), the difference in speed is dramatic: less than a couple of seconds versus 40+ seconds!
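For anyone finding this thread later, the workaround above might look roughly like the sketch below. This is an illustrative outline, not the poster's exact code: the "Price", "Status", and "StatusText" column names, the user object, and the LookupDictionary are assumptions carried over from the earlier snippet, and the exact ImportDataTable overload may vary by product version.

```csharp
// Clone the source table so the original stays untouched.
DataTable view = dataTable.Copy();

// Drop columns the current user's role may not see.
if (!user.CanSeeDetail)
    view.Columns.Remove("Price");

// Add a column holding the looked-up display value, then drop the raw code.
view.Columns.Add("StatusText", typeof(string));
foreach (DataRow row in view.Rows)
    row["StatusText"] = LookupDictionary[row["Status"]];
view.Columns.Remove("Status");

// One fast bulk import instead of thousands of per-cell assignments.
sheet.ImportDataTable(view);
```

The key idea is that the per-row work (role filtering and lookups) happens once on plain DataTable objects, while the cell-by-cell population is left entirely to the optimized bulk import.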