How to speed up UPDATE STATISTICS with full scan

  • Hi,

    I have a 20 GB table that takes 40 minutes to update statistics with a full scan.

    I want to run UPDATE STATISTICS WITH FULLSCAN on a 350 GB table. What's the best way to make this run faster? Extrapolating from the 20 GB table, it will take about 11 hours for the 350 GB table, but I don't have such a long maintenance window. Is there any way to split the update statistics with full scan?

    We are on SQL Server 2008 Standard.

    Thanks,

    thanks,

  • Hi there - I don't think there's a way to optimize the UPDATE STATISTICS operation itself. However, statistics are updated automatically by default, so any statistics that would benefit query processing should be picked up by the auto-update feature.

    What benefits do you gain from manually updating the statistics on the table in question?

    Hope this helps.

    --------------------------------------------

    Laughing in the face of contention...

  • Ah, there can be quite a few reasons to handle statistics updates manually! Here's a Microsoft support article on the issue:

    Article

    Under the "Determining when AutoStat generation is imminent" heading, there's a listing of the thresholds at which statistics auto-update. For tables with 500 or more rows, the threshold is 500 rows plus 20% of the table's rows changed. On a 350 GB table, accumulating that many changes can take quite some time, leaving rather stale stats in the meantime!
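    To put that threshold in perspective, here's a quick back-of-the-envelope calculation (the billion-row count is a made-up example for a large table, not a figure from this thread):

    ```python
    # SQL Server 2008 auto-update threshold for tables with 500+ rows:
    # statistics are invalidated after 500 + 20% of the rows have changed.
    def autostat_threshold(row_count):
        """Rows that must change before auto-update kicks in (500+ row tables)."""
        return 500 + row_count // 5

    # Hypothetical large table with 1 billion rows:
    rows = 1_000_000_000
    print(autostat_threshold(rows))  # -> 200000500
    ```

    In other words, roughly 200 million rows would have to change before the auto-update fires on a table that size.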

    However, as far as speeding up the process itself goes, I don't believe there's a direct route other than better hardware. There is a slightly indirect route, though: if your table has multiple statistics on it (which is quite likely), you could break the statistics updates into one or two per day, adding more to each maintenance window as you find time for them.

    This way, you end up with stats that are still slightly stale at the time of the update, but doing some maintenance a little late is far better than doing none at all. Be sure to test with a representative workload on a separate server, though; figuring out how much you can fit into your maintenance window is the first priority!
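    A sketch of that per-statistic approach (the table and index names are hypothetical; `sys.stats` and `STATS_DATE` are standard catalog objects available in SQL Server 2008):

    ```sql
    -- List the statistics on the table, oldest first, so you can
    -- spread the FULLSCAN updates across several maintenance windows.
    SELECT s.name,
           STATS_DATE(s.object_id, s.stats_id) AS last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID('dbo.BigTable')  -- hypothetical table name
    ORDER BY last_updated;

    -- Then update one named statistic per window
    -- instead of every statistic on the table at once:
    UPDATE STATISTICS dbo.BigTable IX_BigTable_SomeIndex WITH FULLSCAN;
    ```

    Updating named statistics individually keeps each window short, at the cost of some statistics being a few days older than others.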

    - 😀

  • rajsin7786 (8/6/2014)


    Hi,

    I have a 20 GB table that takes 40 minutes to update statistics with a full scan.

    I want to run UPDATE STATISTICS WITH FULLSCAN on a 350 GB table. What's the best way to make this run faster? Extrapolating from the 20 GB table, it will take about 11 hours for the 350 GB table, but I don't have such a long maintenance window. Is there any way to split the update statistics with full scan?

    We are on SQL Server 2008 Standard.

    Thanks,

    What is the purpose of the table? Is it an "audit" table that is only written to and never updated?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • arrjay (8/6/2014)


    What benefits do you gain from manually updating the statistics on the table in question?

    The auto update uses a small sample rate, which may not give you the best statistics. Manual statistics maintenance lets you specify the sample rate.
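    For example (the table name is hypothetical), a manual update lets you pick the trade-off explicitly:

    ```sql
    -- Auto-update typically samples only a small fraction of a large table.
    -- A manual update lets you choose the sample size yourself:
    UPDATE STATISTICS dbo.BigTable WITH SAMPLE 25 PERCENT;

    -- or, when the histogram really needs it:
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;
    ```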

    The SQL Guy @ blogspot

    @SeanPearceSQL

    About Me

  • rajsin7786 (8/6/2014)


    Hi,

    I have a 20 GB table that takes 40 minutes to update statistics with a full scan.

    I want to run UPDATE STATISTICS WITH FULLSCAN on a 350 GB table. What's the best way to make this run faster? Extrapolating from the 20 GB table, it will take about 11 hours for the 350 GB table, but I don't have such a long maintenance window. Is there any way to split the update statistics with full scan?

    We are on SQL Server 2008 Standard.

    Thanks,

    I would suggest finding a sweet spot between a sample rate that is large enough and the time the update takes to run.
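    One way to look for that sweet spot (table and index names are hypothetical) is to time updates at increasing sample rates and then check what the optimizer actually received:

    ```sql
    -- Try a mid-range sample rate and time how long it takes.
    UPDATE STATISTICS dbo.BigTable IX_BigTable_SomeIndex WITH SAMPLE 30 PERCENT;

    -- "Rows" vs. "Rows Sampled" in the header shows the effective sample;
    -- compare histogram quality and runtime against a FULLSCAN baseline.
    DBCC SHOW_STATISTICS ('dbo.BigTable', 'IX_BigTable_SomeIndex') WITH STAT_HEADER;
    ```

    If query plans stay good at, say, 30 percent, you may not need the full 11-hour FULLSCAN at all.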

    The SQL Guy @ blogspot

    @SeanPearceSQL

    About Me

  • I just have to ask: are you physical or virtual? How much RAM and how many CPUs? What does your storage look like?

    40 minutes to update statistics (even with a full scan) on a 20 GB table is horrendously slow.

    Something as small as 20 GB should be way, way less - like maybe 2-3 minutes tops.

    As a point of fact, we always update statistics with full scan, and some of our larger tables range from 100 GB to 400 GB.

    Even those large and very wide tables complete as fast as or faster than your 20 GB table.

    Regards
    Rudy Komacsar
    Senior Database Administrator
    "Ave Caesar! - Morituri te salutamus."
