I’m calling Worksheet.AutoFitRows() on some worksheets that can have a very large number of rows (over 100,000). I know that this is a complex operation and will likely take a while to complete.

However, I’m seeing performance even worse than I expect. When I call AutoFitRows() on worksheets of different sizes, I see times like this:

| Rows | Seconds |
| ---- | ------- |
| 1000 | 1.579 |
| 2000 | 3.5 |
| 2500 | 4.9 |
| 3000 | 6.57 |
| 4000 | 11.6755 |
| 5000 | 23.02 |
| 6000 | 40.89 |
| 7000 | 62.37 |
| 8000 | 84.24 |

If you graph this, you'll see that the time grows much faster than linearly with the number of rows: at the larger sizes, doubling the row count roughly quadruples (or worse) the time. I would expect the time to scale linearly with the number of rows.
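To put a number on the growth rate, here is a quick fit of the exponent k in t ≈ c · n^k to the figures from the table above, using a least-squares fit on the log-log data (plain Python, no Aspose.Cells involved; k = 1 would mean linear scaling):

```python
import math

rows = [1000, 2000, 2500, 3000, 4000, 5000, 6000, 7000, 8000]
secs = [1.579, 3.5, 4.9, 6.57, 11.6755, 23.02, 40.89, 62.37, 84.24]

# For t = c * n^k, log t = log c + k * log n, so the least-squares
# slope of log(time) vs log(rows) estimates the exponent k.
xs = [math.log(n) for n in rows]
ys = [math.log(t) for t in secs]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(f"fitted exponent k = {k:.2f}")
```

The fitted exponent comes out well above 1, confirming the scaling is superlinear rather than the linear behavior I would expect.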

Here's another way of looking at this. I timed how long it took to call AutoFitRows on the first 1000 rows of my worksheet (rows 0–999): that takes 7 seconds. I then timed how long it took to call AutoFitRows on the last 1000 (rows 113417–114416): that takes 29 seconds. My data is quite similar throughout the worksheet, so I would expect the time to be about the same no matter which 1000 rows I AutoFit. Instead, I see that the farther down the worksheet the rows are, the longer they take to AutoFit.
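That position dependence would be consistent with a per-row cost that grows with the row's index (for example, if each row's fit rescans everything above it). A toy cost model (the cost function is purely hypothetical, just to illustrate the shape of the behavior):

```python
def window_cost(start, count, per_row=1.0):
    """Toy model: assume auto-fitting row i costs per_row * i units,
    i.e. the cost of a single row grows linearly with its position."""
    return sum(per_row * i for i in range(start, start + count))

# Same-sized windows at the top vs the bottom of a 114417-row sheet.
first_1000 = window_cost(0, 1000)          # rows 0-999
last_1000 = window_cost(113417, 1000)      # rows 113417-114416
```

Under this model a fixed-size window costs far more near the bottom of the sheet than near the top, and the total cost of fitting all n rows is O(n²), which would match both symptoms I'm seeing.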

Also, if I skip the AutoFitting in Aspose.Cells and instead load the file into Excel 2010, I can AutoFit all 114417 rows in about 4 seconds. Aspose.Cells takes about 30 minutes.

Can you explain why AutoFitRows is so slow, why it doesn't scale linearly, and why it takes thousands of times longer than Excel? Are there optimizations I can make so this scales better?

Thanks.