Slow PST processing speed of Aspose.Email API

Hi team,

We have noticed that the Aspose.Email API is noticeably slower than a commercial alternative, IndependentSoft.de. Here are our comparative results for reading the same OST and PST files with both APIs; times are in milliseconds, and a minimal sketch of the traversal we are timing follows the numbers.

Reading OST by Aspose API: 16595 ms
Reading OST by IndependentSoft: 3942 ms

Reading PST by Aspose API: 161849 ms
Reading PST by IndependentSoft: 5468 ms
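
For reference, this is a minimal sketch of the kind of full traversal we are timing, assuming Aspose.Email for .NET (the namespace may be Aspose.Email.Outlook.Pst in older releases, and the file path is a placeholder); our actual harness differs in detail:

```csharp
using System;
using System.Diagnostics;
using Aspose.Email.Storage.Pst;   // may be Aspose.Email.Outlook.Pst in older versions

class PstReadBenchmark
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        using (PersonalStorage pst = PersonalStorage.FromFile(@"C:\data\sample.pst"))
        {
            ReadFolder(pst, pst.RootFolder);
        }

        sw.Stop();
        Console.WriteLine("Full read took {0} ms", sw.ElapsedMilliseconds);
    }

    // Walk every folder and extract every message as a full MapiMessage,
    // which is the slow path the numbers above measure.
    static void ReadFolder(PersonalStorage pst, FolderInfo folder)
    {
        foreach (MessageInfo info in folder.GetContents())
        {
            MapiMessage msg = pst.ExtractMessage(info);
            // Touch a property so the extraction is not optimized away.
            string subject = msg.Subject;
        }

        foreach (FolderInfo sub in folder.GetSubFolders())
        {
            ReadFolder(pst, sub);
        }
    }
}
```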

Do you have any benchmark results comparing your API against standard or third-party APIs? This issue is causing a lot of delay in our PST processing, and we request that you look into it soon.

Hi Cornelius,

Thank you for writing to the Aspose support team.

I have analyzed the issue and can confirm the difference in reading performance between Aspose.Email and IndependentSoft. The issue has been logged as EMAILNET-34960 in our issue tracking system for further investigation by the product team. I will update this thread as soon as feedback is available.

Hi Cornelius,
We would like to update you on the status of the issue logged as EMAILNET-34960. The slow performance is due to the implementation structure of MapiMessage. The issue is known to us, and we are currently working on introducing a more lightweight alternative to MapiMessage, called MessageObject.

Meanwhile, there are some alternatives you can use to speed up traversal of a PST/OST file when you only need specific properties. You can use PersonalStorage.SaveMessageToStream to write a message directly to a stream instead of extracting it as a MapiMessage. The PersonalStorage.ExtractProperty method can be used to read a specific property without extracting the item in full. You can also use FolderInfo.EnumerateMessageObjects, which enumerates the messages as MessageObject, a lightweight class for working with .MSG files. A sketch of these options is shown below.
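
The following is a minimal sketch of those faster code paths, assuming Aspose.Email for .NET. The folder name "Inbox", the output directory, and the exact ExtractProperty signature are assumptions and may need adjusting to your release:

```csharp
using System;
using System.IO;
using Aspose.Email.Mapi;
using Aspose.Email.Storage.Pst;   // may be Aspose.Email.Outlook.Pst in older versions

class FastPstTraversal
{
    static void Main()
    {
        using (PersonalStorage pst = PersonalStorage.FromFile(@"C:\data\sample.pst"))
        {
            // Hypothetical folder; use whatever folder you need to traverse.
            FolderInfo inbox = pst.RootFolder.GetSubFolder("Inbox");

            // 1. Save each message straight to a stream, skipping MapiMessage entirely.
            int i = 0;
            foreach (string entryId in inbox.EnumerateMessagesEntryId())
            {
                string outPath = Path.Combine(@"C:\out", "message_" + (i++) + ".msg");
                using (FileStream fs = File.Create(outPath))
                {
                    pst.SaveMessageToStream(entryId, fs);
                }
            }

            // 2. Pull a single property (here the subject) without loading the whole item.
            //    The ExtractProperty signature shown here is an assumption.
            foreach (MessageInfo info in inbox.GetContents())
            {
                MapiProperty subject = pst.ExtractProperty(info.EntryId, MapiPropertyTag.PR_SUBJECT_W);
                if (subject != null)
                {
                    Console.WriteLine(subject.GetString());
                }
            }

            // 3. Where your version provides it, FolderInfo.EnumerateMessageObjects()
            //    yields the lighter MessageObject representation instead of full
            //    MapiMessage items.
        }
    }
}
```

In short, the idea is to avoid building a full MapiMessage per item: stream the raw message out when you only need the .MSG content, and pull individual properties when you only need a field or two.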

Please try these at your end and share your feedback with us.