Is SSRS 2008 capable of generating really huge reports? I have DOUBTS about it

  • Hi All,

    I have been doing performance analysis of SSRS 2008 for some really huge reports that return between 1 and 5 million records (with a single user), and for reports that return 100K to 200K records with multiple users. I have observed a lot of issues that make me think SSRS 2008 may not be good enough for such huge reports, and even less so when multiple users run those big reports (though I wanted to use it). Please go through my points and share your views.

    Before putting up the points, I would like to mention my configuration: Windows Server 2003 Enterprise Edition R2 SP2 64-bit (totally dedicated to SSRS; the data source for these reports is SQL Server 2005 Enterprise Edition), Microsoft Reporting Services 2008 R2 64-bit, 8 CPUs, 8 GB RAM. The reports are simple tabular reports fetching data from a single table (no joins).

    1. After executing reports ranging from 1 million to 5 million records, one user at a time, a look at the ExecutionLog2 view shows that TimeDataRetrieval is not an issue: the maximum I see is about a minute, and that is for the 5-million-record report. The main contention is TimeProcessing, which looks like this:

    i. 1M-record report – 17 minutes

    ii. 2M-record report – 38 minutes

    iii. 3M-record report – 54 minutes

    iv. 4M-record report – 59 minutes

    v. 5M-record report – 71 minutes

    These values are pretty consistent; I have executed these reports many times. I don't know how many users would be willing to wait that long for a report.
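    In case anyone wants to reproduce these numbers: they come from the ExecutionLog2 view in the ReportServer catalog database, and a minimal query to pull the same breakdown (times are in milliseconds) could look like the sketch below.

        -- Sketch: per-execution timing breakdown recorded by SSRS.
        -- ExecutionLog2 is a view in the ReportServer catalog database;
        -- [RowCount] is bracketed to be safe, since ROWCOUNT is a keyword.
        SELECT   ReportPath, TimeStart, [RowCount],
                 TimeDataRetrieval, TimeProcessing, TimeRendering, Status
        FROM     dbo.ExecutionLog2
        ORDER BY TimeStart DESC;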

    2. The reports mentioned above were executed one at a time, and each report eats up between 1 GB and 4 GB of memory depending on the report size (the memory usage data comes from the Working Set perfmon counter for the SSRS process, which may not show exact memory usage but gives an idea). Given a multiple-user scenario, say 100 users, I don't know how much RAM I would have to put in the server to make it work. At any given time our business scenario expects 20 concurrent report users, which at roughly 3 GB or more per report means at least 64 GB of RAM!! Not at all cost-effective.

    3. During a few executions of the 4M- and 5M-record reports, the ExecutionLog2 entries were suspicious. After about an hour I saw a browser error (PageRequestManagerException) and had no clue how to relate it to any SSRS issue. A look into the log file showed that it was a timeout error. Surprisingly, when I looked at the ExecutionLog2 view, it had an entry marking this execution as a successful report generation, even though the report was never rendered!! I observed this behavior at least 3 or 4 times. Setting an appropriate timeout value resulted in correct report generation.
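    A side note for anyone who hits the same mismatch: my reading is that the Status column in ExecutionLog2 records whether processing succeeded, not whether the browser ever received the result. One way to spot these suspect runs is to look for successes whose duration exceeded the client timeout; the sketch below assumes a 60-minute threshold, matching the timeout I ran into.

        -- Sketch: executions logged as successful that ran longer than the
        -- assumed client timeout, i.e. candidates for "success but never rendered".
        SELECT   ReportPath, TimeStart, TimeEnd, Status, ByteCount
        FROM     dbo.ExecutionLog2
        WHERE    Status = 'rsSuccess'
          AND    DATEDIFF(minute, TimeStart, TimeEnd) >= 60  -- assumed timeout
        ORDER BY TimeStart DESC;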

    4. The next round of testing was with 100K-record reports and concurrent users. This is the part I am most worried about. In our business scenario we have only a few reports that might return up to 2M or 3M records; most return between 100K and 200K records. I started with a 100K-record report and 10 concurrent users, and it worked well. Then I increased the users to 20 for the same report, and it still worked well. When I increased the users to 30, I got an error. The log file shows a warning that says 'memory limit exceeds', followed by the exception 'System.Threading.ThreadAbortException: Thread was being aborted'. I interpreted this as SSRS having consumed the maximum memory assigned to it (5 GB via the <WorkingSetMaximum> setting), because the perfmon counters I was running showed the Working Set for the SSRS process reaching 5 GB and the Memory Pressure counter reaching 5, though the error message in the log file was not very explanatory. The point that bothers me is that a 100K-record report with 20 or 30 concurrent users is a common scenario, and it ate all the memory up to the maximum limit.
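    For reference, the 5 GB cap mentioned above is set in RSReportServer.config under the Service element. A sketch with the value I used (WorkingSetMaximum is expressed in kilobytes; the WorkingSetMinimum figure here is just an illustrative choice, not something from my tests):

        <Service>
          ...
          <WorkingSetMaximum>5242880</WorkingSetMaximum>  <!-- 5 GB, in KB -->
          <WorkingSetMinimum>2621440</WorkingSetMinimum>  <!-- illustrative 2.5 GB floor -->
        </Service>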

    5. I repeated the experiment with the 100K-record report and 10 concurrent users, which had succeeded earlier, but it failed this time. I saw this error message in the log file: 'System.ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection.' I have no idea what went wrong. I ran the same experiment once more and it succeeded!!

    6. I was not confident, so I tried again and executed the 100K-record report with 10 concurrent users, and this time it didn't succeed. The log file showed a well-known but vague error message: 'System.NullReferenceException: Object reference not set to an instance of an object.' I am puzzled as to why report generation sometimes succeeds and sometimes fails. The error messages are not consistent either.

    7. Most of the replies to my queries on the SSRS forum suggested not to create such huge reports, to restrict the report size, etc., but I can't help it; there are a few reports in our requirements that are going to be this big.

    If you feel there is something wrong in the way I am doing these experiments, please correct me; I will be happy to hear it. Our requirement is many reports ranging between 100K and 200K records, 2 reports with 2M or 3M records, and 100 concurrent users. I need your suggestions on whether we should really go with SSRS 2008.

  • Sachin Vaidya (10/7/2010)


    I don't know how many users would be willing to wait that long for a report.

    I don't know ANY users who would even consider reading or using a report like this! 100K rows is about 2,000 pages when printed. No one is going to slog through that amount of data, ever. If you are talking about processing 100K rows into a report with fewer rows because of groupings or summaries, then I would say you have some work to do on query optimization.

  • Daniel Bowlin (10/7/2010)


    Sachin Vaidya (10/7/2010)


    I don't know how many users would be willing to wait that long for a report.

    I don't know ANY users who would even consider reading or using a report like this! 100K rows is about 2,000 pages when printed. No one is going to slog through that amount of data, ever. If you are talking about processing 100K rows into a report with fewer rows because of groupings or summaries, then I would say you have some work to do on query optimization.

    Hi Daniel,

    Thanks for your reply.

    Actually, I know that no one is going to read such a huge report. My motive here is to do a performance analysis of SSRS 2008 and to test its limits. We do have a lot of reports where we summarize and group the data to render it in a presentable way. But unfortunately, there is a requirement for at least 2 reports that need to fetch around 2 to 3 million records all at once. Right now we do this through C++ (a front-end GUI over highly optimized SQL stored procedures), but we are planning to move the reporting to a reporting tool, probably SSRS 2008.

    Sorry for the huge amount of text in my original post, but as I mentioned there, data retrieval is not an issue; even the 5-million-row report takes at most 1 minute. The main problem is the report processing time.

    Moreover, during my experiments (I am monitoring memory perfmon counters and the SSRS perfmon counters, including the newly added Reporting Services counters), I found that for a 100K-record report with as few as 10 concurrent users, the working set of the Reporting Services process shows memory consumption of up to 3 GB, which shows that SSRS is a memory-intensive tool. So it is hard for me to predict the maximum memory requirement for rolling SSRS out to the business.

    Finally, during my tests I noticed inconsistency in report rendering (please refer to my original post): sometimes the reports generated successfully, sometimes they failed with a different exception each time, and sometimes they failed but the ExecutionLog2 view showed an entry for a successful rendering of the report!!

    P.S. Before starting this experiment, I was the first to ask this question: "Who would consider reading such a huge report?"

  • 7. Most of the replies to my queries on the SSRS forum suggested not to create such huge reports, to restrict the report size, etc., but I can't help it; there are a few reports in our requirements that are going to be this big.

    At the risk of repeating what others may have said, I suggest you sit with the people who receive the report and ask them exactly what they're looking for. What you're doing now is akin to producing an entire phone book when all the users want is to look up a plumber or a mechanic or a lawyer.

    If they really want the whole phone book, keep digging for what they really want. If they continue to insist they want the whole thing, print it for them and then observe what information they look for as they read it; I guarantee they won't flip from page 1 to page 2000 in order. There has to be a way to reduce the number of records returned, if only by running the report in partitions.
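    If partitions are the way to go, one approach that works on the SQL Server 2005 back end described earlier is ROW_NUMBER() paging, so each run returns one manageable slice. Everything in the sketch below (the table, the columns, and the @PageNum/@PageSize parameters) is a hypothetical stand-in for the real report query:

        -- Sketch: serve the "phone book" one slice at a time.
        -- dbo.BigReportTable, OrderID, CustomerID, Amount, @PageNum and
        -- @PageSize are illustrative names, not from the original posts.
        DECLARE @PageNum int, @PageSize int;
        SET @PageNum  = 1;        -- which slice to return
        SET @PageSize = 100000;   -- rows per slice

        WITH Numbered AS (
            SELECT ROW_NUMBER() OVER (ORDER BY OrderID) AS rn,
                   OrderID, CustomerID, Amount
            FROM   dbo.BigReportTable
        )
        SELECT OrderID, CustomerID, Amount
        FROM   Numbered
        WHERE  rn BETWEEN (@PageNum - 1) * @PageSize + 1
                      AND  @PageNum * @PageSize;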

  • @Doug

    Again, I completely agree with what you say, but what about the inconsistent errors? Let me quickly summarize once again:

    - Sometimes the report renders properly.

    - Sometimes the report fails, with a different exception in the log each time.

    - Sometimes the report fails, but I find a successful entry for it in the ExecutionLog2 view in the report server database.

    Regards.

  • My hunch is that, from the Reporting Services engine's perspective, the work has been done -- that is, the report has been compiled and handed off to Report Manager (the web server side of the process). However, the web server can't render the compiled report in time, or runs out of resources trying, hence the .NET exceptions. In most if not all cases, it looks like web rendering is the bottleneck.

  • So, any pointers on where I should look to find the cause of these inconsistent rendering issues?

  • Daniel Bowlin (10/7/2010)


    Sachin Vaidya (10/7/2010)


    I don't know how many users would be willing to wait that long for a report.

    I don't know ANY users who would even consider reading or using a report like this! 100K rows is about 2,000 pages when printed. No one is going to slog through that amount of data, ever.

    Woah, Nelly! Now you've done it!

    I can't comment on the number of rows returned (lots!) because I haven't checked, but I HAVE recently generated a couple of 3,000-5,000-page reports from RS, and yes, people did slog through them 🙂 They took about 10-20 minutes to run, and on the local PC I had to close a lot of things to make sure it didn't run out of memory partway through.

    Why? We were generating a massive run of price lists for all of our customers in PDF format, to be printed and mailed out (not everyone is on the email train yet!). In another case, we had a problem with the layout of a 3,000-strong invoice run, and I had to script something up to regenerate the whole lot for printing, separation, and mailing 🙂

    Sorry I can't comment on the errors you've mentioned; the only problems I've had are timeout and memory errors. Large report runs like this are infrequent enough that it's not a big problem. I don't have the specs of the server on hand, but it likely has 4 GB of RAM.
