
Free Support Forum - aspose.com

LINQ Reporting Engine fails at foreach on empty JSON arrays

Issue:
The Reporting Engine raises an exception when an empty JSON array is passed to a foreach statement, i.e. “An error has been encountered at the end of expression ‘row.val]>’. Can not get the value of member ‘val’ on type ‘System.Data.DataRow’.”

What I expected:
An exception should not be thrown. The foreach loop should simply not execute, and its contents should be removed from the document.

To reproduce the issue:

JSON file emptyarr.json

{ "key": 1,
  "arr": [] }

Template document emptyarr.docx

<<foreach[row in data.arr]>><<[row.val]>>
<</foreach>>

C# code

var doc = new Document("emptyarr.docx");
var json = new JsonDataSource("emptyarr.json");
var rb = new ReportingEngine();
rb.BuildReport(doc, json, "data");

@p.maier

LINQ Reporting Engine parses a template document as a whole before evaluating any expression. The engine does not skip parsing of expressions that are not going to be evaluated according to template logic and data. In this case, no member val is defined for arr items in your JSON file, hence the parsing error, which is expected behavior rather than a bug.

However, there is the ReportBuildOptions.AllowMissingMembers option designed for scenarios exactly like yours. You can apply the option as follows:

ReportingEngine engine = new ReportingEngine();
engine.Options |= ReportBuildOptions.AllowMissingMembers;
engine.BuildReport(...);

Please see Accessing Missing Members of Data Objects for more information.
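Putting the pieces together, a minimal sketch of the option applied to the original reproduction might look as follows (file names taken from the post above; the usual Aspose.Words and Aspose.Words.Reporting namespaces are assumed):

```csharp
using Aspose.Words;
using Aspose.Words.Reporting;

// Load the template and the JSON data source from the reproduction above.
Document doc = new Document("emptyarr.docx");
JsonDataSource json = new JsonDataSource("emptyarr.json");

ReportingEngine engine = new ReportingEngine();
// Treat references to missing members (such as row.val on items of the
// empty "arr" array) as null instead of failing at parse time.
engine.Options |= ReportBuildOptions.AllowMissingMembers;
engine.BuildReport(doc, json, "data");

doc.Save("emptyarr_out.docx");
```

With the option set, the empty foreach body is simply removed from the output document.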

Thank you. While the proposed workaround does its job, it makes it much harder to find mistakes in template expressions. Are there any plans to move away from static type checking in the reporting engine, since it doesn’t seem to fit how data sources such as JSON and XML work?
Thanks.

@p.maier

The feature request was already filed as WORDSNET-12148. At the moment, it is postponed and there is no ETA available. You will be notified through the forum thread once there is any update on this.

Thank you. Would it be possible to adapt the reporting engine to delay the type checking of data members until they are actually used, e.g. in a loop body? Or to do the type checking but only raise the error if the loop body is executed? Are there any options for us to incentivize a modification like that? Thanks.

The current behavior is pretty bad, to be honest. The engine should not break just because a collection (or JSON array for that matter) is empty.

@p.maier

LINQ Reporting Engine supports a subset of the C# language in its template syntax and mimics C# compiler behavior to some extent. For example, parsing of template syntax corresponds to compile time. When code attempts to use a missing member of an object, the C# compiler throws an error at compile time, and so does the engine. This is exactly the same behavior.

The inconvenience you mentioned comes from the JSON notation itself: it does not allow specifying which properties an item of an empty array should have. Hence, the engine cannot obtain this information and raises an error on any attempt to use a member of such an item, as the member is missing from the JSON file. For an empty array coming from a regular .NET object, for example, this situation would not arise, because the type information would still be present.
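To illustrate the difference, here is a hypothetical typed data source (the Row and Data classes and the template file name are made up for this sketch): the collection is empty, yet the engine can still discover the Val member from the item type, so no AllowMissingMembers option is needed.

```csharp
using System.Collections.Generic;
using Aspose.Words;
using Aspose.Words.Reporting;

// A template such as <<foreach [row in data.Arr]>><<[row.Val]>><</foreach>>
// parses here even though Arr is empty, because Row.Val is known from the type.
Document doc = new Document("typedarr.docx");
new ReportingEngine().BuildReport(doc, new Data(), "data");

// Hypothetical typed item: the Val property is visible to the engine
// via the type, even when no instances exist.
public class Row
{
    public int Val { get; set; }
}

public class Data
{
    // Empty at run time, but still carries item type information.
    public List<Row> Arr { get; set; } = new List<Row>();
}
```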

To make the engine continue as usual (that is, not fail) when it encounters a reference to a missing data member, you can apply the ReportBuildOptions.AllowMissingMembers option as I shared earlier.

If you are worried that some typos in template expressions might then go unnoticed, I would suggest the following approach: do not use the ReportBuildOptions.AllowMissingMembers option (or empty JSON arrays) while designing and testing a template, and do use the option when building an actual report.
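That approach can be sketched as a single code path with a switch (the isProduction flag is hypothetical and would come from your own configuration):

```csharp
using Aspose.Words;
using Aspose.Words.Reporting;

// Hypothetical flag from your own configuration: strict checking while
// designing/testing templates, lenient behavior for real reports.
bool isProduction = true;

Document doc = new Document("emptyarr.docx");
ReportingEngine engine = new ReportingEngine();

if (isProduction)
{
    // Real data may contain empty JSON arrays; tolerate missing members.
    engine.Options |= ReportBuildOptions.AllowMissingMembers;
}
// During design and testing, leave the option off (and use non-empty
// test data) so that typos in template expressions still surface as errors.

engine.BuildReport(doc, new JsonDataSource("emptyarr.json"), "data");
```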

WORDSNET-12148 is logged to support working with dynamic objects. Please check our paid support policies to find a suitable option to raise its priority. However, I am afraid the feature is very laborious to implement, so raising its priority will not significantly speed up its delivery.