speed concerns for a large report

  • Hi there,

    I recently received some advice on a large, complex financial report: build a cube as the data source. Apparently this makes the report run a lot faster.

    What I initially had in mind was to have the report run from a single table. A job would run a procedure every 30 minutes, pull data from various tables, format it, and put it into one reporting table. The idea is to make the report itself as fast as possible.

    In this case, would a cube still be a viable option? Or is there something else I could do with the data in the reporting table to speed it up even more? Perhaps pre-aggregating some of the data when I add it to the table?
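
    Roughly, I'm picturing the job calling a procedure along these lines every 30 minutes (table and column names are made up, just to illustrate the idea):

    -- Illustrative only: rebuild one flat reporting table from the source tables.
    CREATE PROCEDURE dbo.usp_RefreshReportingTable
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Full rebuild every run (could also be an incremental load).
        TRUNCATE TABLE dbo.ReportingTable;

        INSERT INTO dbo.ReportingTable (AccountID, TransactionDate, Region, Amount)
        SELECT  a.AccountID,
                t.TransactionDate,
                r.RegionName,
                t.Amount
        FROM    dbo.Transactions t
                JOIN dbo.Accounts a ON a.AccountID = t.AccountID
                JOIN dbo.Regions  r ON r.RegionID  = a.RegionID;
    END;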

    Thanks

  • I believe that pre-aggregation would be one of the keys to speed here. Since you're running it as a job every 30 minutes, my suggestion would be to try to avoid doing it all in one query. Divide'n'Conquer just as you would in any programming language. Pre-aggregation is a part of that. Peel one potato at a time in your code and you'll be amazed at how fast you can get it all to run.
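
    For example, something along these lines (made-up table names, purely to show the shape of it) instead of one monster query:

    -- Step 1: Peel the first potato - stage just the rows you actually need.
    SELECT  t.AccountID, t.TransactionDate, t.Amount
    INTO    #RecentTransactions
    FROM    dbo.Transactions t
    WHERE   t.TransactionDate >= DATEADD(DAY, -90, GETDATE());

    -- Step 2: Pre-aggregate the staged rows.
    SELECT  AccountID,
            CAST(TransactionDate AS DATE) AS TransactionDay,
            SUM(Amount) AS DailyAmount
    INTO    #DailyTotals
    FROM    #RecentTransactions
    GROUP BY AccountID, CAST(TransactionDate AS DATE);

    -- Step 3: Join the small, pre-aggregated set to the lookups and load the reporting table.
    INSERT INTO dbo.ReportingTable (AccountID, RegionName, TransactionDay, DailyAmount)
    SELECT  d.AccountID, r.RegionName, d.TransactionDay, d.DailyAmount
    FROM    #DailyTotals d
            JOIN dbo.Accounts a ON a.AccountID = d.AccountID
            JOIN dbo.Regions  r ON r.RegionID  = a.RegionID;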

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

    Thanks for the response, Jeff.

    I take it that by not doing it all in one, you mean doing it in batches within one job?

    Edit:

    Actually, about pre-aggregating, I'm a little confused by this one. I obviously want to cut the amount of processing time the report takes to run by pre-calculating things and then inserting the results into a reporting table. As far as pre-aggregating goes, I'd want to do most of the grouping calculations (SUM etc.) there that the report would normally do.

    However, I'm using parameters so obviously the results in the report will be different for different people. Is it possible to still pre-aggregate data in this way?
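
    For example, would something like this work (made-up names again): aggregate down to the level of the report parameters, so each user's parameter values can still filter the pre-aggregated rows?

    -- Illustrative only: pre-aggregate to the grain of the report parameters (e.g. Region, AccountManager, month).
    INSERT INTO dbo.ReportingTable (Region, AccountManager, MonthStart, TotalAmount)
    SELECT  r.RegionName,
            a.AccountManager,
            DATEFROMPARTS(YEAR(t.TransactionDate), MONTH(t.TransactionDate), 1) AS MonthStart,
            SUM(t.Amount) AS TotalAmount
    FROM    dbo.Transactions t
            JOIN dbo.Accounts a ON a.AccountID = t.AccountID
            JOIN dbo.Regions  r ON r.RegionID  = a.RegionID
    GROUP BY r.RegionName,
             a.AccountManager,
             DATEFROMPARTS(YEAR(t.TransactionDate), MONTH(t.TransactionDate), 1);

    -- The report itself would then only filter and roll up further, e.g.:
    -- SELECT Region, MonthStart, SUM(TotalAmount)
    -- FROM   dbo.ReportingTable
    -- WHERE  Region = @Region AND AccountManager = @AccountManager
    -- GROUP BY Region, MonthStart;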

    Thanks

  • Yes... multiple queries in one batch job. Sometimes you can get away with just one query... sometimes not.

    At this point, I'd recommend that you post your query, the table defs, and the execution plan per the article at the second link in my signature below. There are a bunch of talented people on this forum who can and will help if you give them the right information without having to ask a bunch of unnecessary questions. Check the article out and post back...

    --Jeff Moden

  • Thanks Jeff

    Yeah, that's fair enough... Once I've gathered enough info and still think I need help, I'll post everything necessary.

    Cheers
