We have a report that I think is fairly complex. The final document is not that big, about 2 MB, but it does have a lot of merge fields: about 3,711 in total including the tables, and of those 3,711, 2,143 are merge IF fields. The table sections are tables nested within tables, i.e. subtables. We are also using HTML formatting, table formatting (removing empty rows), and inserting an image into the header of each document. The report will grow or shrink depending on what data we load into it. Small record sets finish very quickly, while very large record sets take 45 seconds to a minute. If we run 3 large reports simultaneously, the CPU goes over 100% and the transactions time out. I think I read somewhere that the entire DOM is rebuilt after a merge IF field is updated; is this true?
My plan to fix this is to:
- Remove the IF fields and move that logic into the Java code; each field that was an IF will now have a unique field name in the document
- Break the document up from 1 large document into many smaller documents
- Thread the creation of the smaller documents; once those are finished, they would then be merged into 1 large document
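The threading step of the plan above could be sketched roughly like this. This is only an illustration of the pattern, not our actual code: each "sub-document" is stood in for by a String, and `buildSection` is a hypothetical placeholder for the per-section work (populating fields, formatting tables, etc.); in the real code these would be word-processing document objects appended together at the end.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelReportBuilder {

    // Hypothetical placeholder for generating one small sub-document.
    static String buildSection(int sectionIndex) {
        return "section-" + sectionIndex;
    }

    // Build all sections concurrently, then merge them back in their
    // original order by iterating the futures in submission order.
    static String buildReport(int sectionCount, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (int i = 0; i < sectionCount; i++) {
                final int idx = i;
                futures.add(pool.submit(() -> buildSection(idx)));
            }
            StringBuilder merged = new StringBuilder();
            for (Future<String> f : futures) {
                merged.append(f.get()).append('\n'); // get() blocks until done
            }
            return merged.toString();
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.print(buildReport(4, 2));
    }
}
```

One thing we would also need to watch: the thread pool size should probably be capped (e.g. at the number of available processors) so that three large reports running at once do not oversubscribe the CPU, which is exactly the failure we are seeing today.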
Do you have any other advice or best practices that might help?