Hi guys,
Setup:
- We use dot/dotx files as templates (all configured with mail merge syntax).
- We use a DataSet containing multiple tables as the data source. We are trying to enable LINQ Reporting.
We have logic that looks at the merge field list in the document template and then conditionally loads data depending on which merge fields are used. This reduces the amount of data we load from the database for exporting.
For example, our source code looks something like this:

    var tableFields = doc.MailMerge.GetFieldNames()?.Where(x => x.StartsWith("TableStart:"));
    var ds = new DataSet();
    if (tableFields != null && tableFields.Contains("TableStart:DataGroup1"))
    {
        ds.Tables.Add(LoadDataGroup1());
    }
Obviously, this trick works for documents configured with mail merge. However, LINQ Reporting does not have anything like the MailMerge collection that lists the fields used in the document. The only option I could think of is parsing the text content, like below:
    var docContent = doc.GetText();
    // "snt" is our namespace
    if (docContent.IndexOf("snt.DataGroup1", StringComparison.OrdinalIgnoreCase) >= 0)
    {
        ds.Tables.Add(LoadDataGroup1());
    }
Is this the only way to achieve conditional data loading? Could you recommend a better approach if there is one?
Thanks
@sentientsoft
For classic mail merge templates, you can use the GetFieldNames method to retrieve the list of fields and conditionally load data accordingly. However, as you noted, LINQ Reporting does not provide a direct equivalent to the MailMerge collection for this purpose.
Your current approach of parsing the document’s text content to check for specific field names is a valid workaround. However, there are more structured ways to achieve this without relying solely on string parsing.
Recommended Approach
1. Use a custom data source: Create a custom data source that dynamically provides data based on the fields present in the document. This way, you can encapsulate the data loading logic within the data source itself.
2. Implement conditional loading logic: In your custom data source, check for the presence of specific fields and load the corresponding data. This can be done by implementing the members that the engine calls during report generation.
Here’s a simplified example of how you might implement this:
    public class CustomDataSource : IMailMergeDataSource
    {
        private readonly IEnumerator _enumerator;
        private readonly Dictionary<string, Func<DataTable>> _dataLoaders;

        public CustomDataSource(IEnumerable data)
        {
            _enumerator = data.GetEnumerator();
            _dataLoaders = new Dictionary<string, Func<DataTable>>
            {
                { "snt.DataGroup1", LoadDataGroup1 },
                { "snt.DataGroup2", LoadDataGroup2 }
                // Add more mappings as needed
            };
        }

        // Required by IMailMergeDataSource: the merge region this source feeds.
        public string TableName => "DataGroup1";

        public bool MoveNext() => _enumerator.MoveNext();

        public bool GetValue(string fieldName, out object fieldValue)
        {
            if (_dataLoaders.TryGetValue(fieldName, out Func<DataTable> loader))
            {
                fieldValue = loader();
                return true;
            }
            fieldValue = null;
            return false;
        }

        // Required by IMailMergeDataSource: return a source for nested
        // regions, or null if there are none.
        public IMailMergeDataSource GetChildDataSource(string tableName) => null;

        private DataTable LoadDataGroup1()
        {
            // Load and return DataGroup1 from the database.
            throw new NotImplementedException();
        }

        private DataTable LoadDataGroup2()
        {
            // Load and return DataGroup2 from the database.
            throw new NotImplementedException();
        }
    }
Usage
When building your report, instantiate this custom data source and pass it to the document's mail merge engine. The engine calls GetValue for each merge field, allowing you to load data conditionally based on the fields actually present in the document.
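As a rough sketch, usage could look like the following. Note this is an illustration, not verified code: the file names and the LoadRows() helper are placeholders, and it assumes the Aspose.Words MailMerge.ExecuteWithRegions overload that accepts an IMailMergeDataSource.

```csharp
// Hypothetical usage sketch; "template.dotx", "output.docx" and
// LoadRows() are placeholders for your own template, output path
// and row-collection helper.
Document doc = new Document("template.dotx");
var dataSource = new CustomDataSource(LoadRows());

// The template uses TableStart/TableEnd regions, so execute with regions;
// the engine will call GetValue only for the fields it encounters.
doc.MailMerge.ExecuteWithRegions(dataSource);
doc.Save("output.docx");
```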
This approach keeps your data loading logic encapsulated and avoids the need for string parsing, making your code cleaner and more maintainable.
Conclusion
While your current method works, using a custom data source with conditional loading logic is a more robust solution. This allows you to manage data loading in a structured way, improving the maintainability of your code.
If you have further questions or need more examples, feel free to ask!
@sentientsoft
Please check this comment for a basic idea on how to collect the expressions used in a template for the LINQ Reporting Engine. If needed, the approach can be extended to consider foreach tags specifically.
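For illustration, a crude version of such expression collection could be sketched as below. This is an assumption-laden sketch, not the approach from the linked comment: it relies on Document.GetText(), the engine's default `<<foreach [...]>>` tag syntax, and the ds/LoadDataGroup1 names from the original question.

```csharp
// Sketch: scan the raw template text for LINQ Reporting foreach tags.
// <<foreach [item in snt.DataGroup1]>> is the engine's default syntax;
// adjust the regex if your templates differ.
using System.Text.RegularExpressions;

string text = doc.GetText();
var usedExpressions = Regex.Matches(text, @"<<foreach\s*\[(?<expr>[^\]]+)\]>>")
    .Cast<Match>()
    .Select(m => m.Groups["expr"].Value)   // e.g. "item in snt.DataGroup1"
    .ToList();

if (usedExpressions.Any(e => e.Contains("snt.DataGroup1")))
{
    ds.Tables.Add(LoadDataGroup1());
}
```

A structure-aware walk over the template's nodes would be more robust than text scanning, but the regex keeps the sketch short.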
Thanks Ivan,
That post seems like a more applicable approach for us.