Performance issue when handling large amounts of data

  • Hi,

We have developed a report that can pull up to 18 lakh (1.8 million) rows. Of course, we have filters in the report to restrict the size of the data; however, the business users are curious to see the report pull all the records with paging (30 records per page).

The report takes around 15 minutes in total to render. Of that, the query execution takes roughly 36 seconds, and the remaining time is consumed by report processing.

Can you please suggest caching options, or anything else, that could improve the performance?

    Thanks

    Daniel
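    One common approach here is to push the paging into the query itself, so the database returns only the 30 rows the current page needs instead of the report engine processing all 1.8 million rows. Below is a minimal sketch of the idea in Python with SQLite (SQL Server 2012+ has the equivalent `OFFSET ... FETCH` clause); the table and column names are invented for illustration.

```python
import sqlite3

# Invented demo table standing in for the report's source data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO sales (amount) VALUES (?)",
                 [(i * 1.5,) for i in range(100)])

PAGE_SIZE = 30

def fetch_page(conn, page):
    """Return one page of rows; the database does the skipping."""
    offset = (page - 1) * PAGE_SIZE
    return conn.execute(
        "SELECT id, amount FROM sales ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()

page1 = fetch_page(conn, 1)  # rows 1-30
page2 = fetch_page(conn, 2)  # rows 31-60
```

    The trade-off is one query round trip per page instead of one huge result set up front; with a supporting index on the `ORDER BY` column, each page fetch stays cheap.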

Are you suggesting that the users generate all 18 lakh (1.8 million) rows, 30 rows per page, in a single report? I suppose there are a lot of options, but I simply wouldn't allow that report to be generated... who is going to look through 1.8 million rows of data 30 rows at a time? Block the report.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.
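    To illustrate the set-based idea above with a small, hypothetical Python/SQLite sketch (table names invented): the same column update done row by agonizing row versus in one set-based statement. Both produce identical results, but the RBAR version issues one statement per row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
for name in ("rbar", "setbased"):
    conn.execute(f"CREATE TABLE {name} (id INTEGER PRIMARY KEY, price REAL)")
    conn.executemany(f"INSERT INTO {name} (price) VALUES (?)",
                     [(p,) for p in (10.0, 20.0, 30.0)])

# RBAR: loop over the rows, one UPDATE round trip per row.
for (row_id,) in conn.execute("SELECT id FROM rbar").fetchall():
    conn.execute("UPDATE rbar SET price = price * 1.1 WHERE id = ?", (row_id,))

# Set-based: one UPDATE acting on the whole column at once.
conn.execute("UPDATE setbased SET price = price * 1.1")

rbar_prices = [round(p, 2) for (p,) in
               conn.execute("SELECT price FROM rbar ORDER BY id")]
set_prices = [round(p, 2) for (p,) in
              conn.execute("SELECT price FROM setbased ORDER BY id")]
```

    At three rows the difference is invisible; at 1.8 million rows, the per-row round trips are exactly the kind of overhead the RBAR label warns about.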


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
