Tuning a query

  • Guys,

    I am stuck with a task here. There is a procedure which gets called from a report, but the procedure takes a long time. I am posting the code for one SELECT statement. There are 6 similar blocks of statements in the procedure, so tuning one thoroughly should help the others, as the rest follow similar patterns. I have added indexes as required and do not see a scan anywhere for my billion-row table. Still, each block takes 5-6 seconds to run, so the procedure takes around 25-30 seconds, and in busy hours this causes timeouts. Please find the query, schema details and execution plan, and suggest a remedy or further scope of work.

    Regards

    Chandan
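    To see where those 5-6 seconds per block actually go, one generic option (a sketch, not specific to the posted procedure) is to run each SELECT with statistics output enabled:

    ```sql
    -- Report elapsed/CPU time and logical reads per statement,
    -- so the six blocks can be compared individually.
    SET STATISTICS TIME ON;
    SET STATISTICS IO ON;

    -- <paste one of the 6 SELECT blocks here>

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;
    ```

    High logical reads relative to rows returned usually point at the lookup or estimate problems discussed below.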

  • Can you post the full sp? Maybe the biggest optimisation opportunity is to merge those statements.

  • Well, I'm pretty sure statistics need to be updated; the estimated vs actual rows are a couple of orders of magnitude off on two of the index seeks.

    I don't see anything that would be addressed by an extra index, though, but my skills are admittedly limited.

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!

  • I was waiting a little bit to go there, Lowell (6 B rows), and DISTINCT seems to always screw up the estimates. It might not be the case if you run this on unique constraints, but I've never tested that.

  • OK, I see there is a RID Lookup on the Company table, which is a heap; that seems to be something to address. I'm going to sit back and learn a bit more on this one; thanks for the tidbit about DISTINCT messing with estimated rows....

    Lowell


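    If Company really is a heap, one way to remove the RID Lookup is to give the table a clustered index. This is only a sketch; the key column name is an assumption, since the actual schema isn't reproduced in this excerpt:

    ```sql
    -- Assumption: Company has a unique CompanyID column; substitute the real key.
    -- Converting the heap rewrites the table and its nonclustered indexes,
    -- so run this in a maintenance window.
    ALTER TABLE dbo.Company
        ADD CONSTRAINT PK_Company PRIMARY KEY CLUSTERED (CompanyID);
    ```

    Alternatively, a covering nonclustered index that INCLUDEs the looked-up columns would remove the RID Lookup without rebuilding the table.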

    Update statistics on Products and Transactions with full scan. The estimates are off a bit. See if that changes the plans at all.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Guys, I posted the plan after I updated the statistics for the whole database.

  • Did you update with fullscan, if not by what %?

  • .

  • Ninja's_RGR'us (8/9/2011)


    Did you update with fullscan, if not by what %?

    I just ran:

    exec sp_updatestats

    Do I need to modify this statement? Please advise.

    Just FYI. Following are the number of rows:

    Company: 9214

    Transactions: 31509814

    Products: 31589654

  • I'd still like to see the whole SP.

    In the meantime you could experiment with this >

    http://sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries

    You seem to be fine here, but I would also change to inner joins. It makes the query easier to understand and it avoids accidental cross joins.

    I think it may also be why there's a DISTINCT in there (it's almost there by default for anyone who uses the old join syntax regularly). Can you confirm that the DISTINCT is actually required for that query? Just removing it might prove a big help.

    After all that we can play with splitting this query into steps, or maybe covering indexes...
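    As an illustration of the point above (table and column names are invented here, since the actual query isn't reproduced in this excerpt), the rewrite from old-style comma joins looks like this:

    ```sql
    -- Old style: comma joins with the predicate in WHERE. An accidental
    -- cross join is one missing predicate away, and DISTINCT often gets
    -- added to paper over the resulting duplicates.
    SELECT DISTINCT c.CompanyName, t.Amount
    FROM dbo.Company c, dbo.Transactions t
    WHERE c.CompanyID = t.CompanyID;

    -- Explicit INNER JOIN: the join predicate lives with the join, and
    -- if the join is on a key, the DISTINCT may turn out to be unnecessary.
    SELECT c.CompanyName, t.Amount
    FROM dbo.Company c
    INNER JOIN dbo.Transactions t
        ON c.CompanyID = t.CompanyID;
    ```

    Removing an unneeded DISTINCT also removes the aggregate the optimizer has to plan around, which ties back to the estimate problems mentioned earlier.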

  • chandan_jha18 (8/9/2011)


    Ninja's_RGR'us (8/9/2011)


    Did you update with fullscan, if not by what %?

    I just ran:

    exec sp_updatestats

    Do I need to modify this statement? Please advise.

    Just FYI. Following are the number of rows:

    Company: 9214

    Transactions: 31509814

    Products: 31589654

    That uses the default or last sampling rate. Please run this to make 100% sure you have good stats >

    UPDATE STATISTICS dbo.table WITH FULLSCAN
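    Applied to the three tables named earlier (assuming they all live in the dbo schema), that would be:

    ```sql
    -- Read every row rather than a sample when rebuilding the statistics;
    -- slower on the 31M-row tables, but gives the optimizer accurate counts.
    UPDATE STATISTICS dbo.Company      WITH FULLSCAN;
    UPDATE STATISTICS dbo.Transactions WITH FULLSCAN;
    UPDATE STATISTICS dbo.Products     WITH FULLSCAN;
    ```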

  • chandan_jha18 (8/9/2011)


    Ninja's_RGR'us (8/9/2011)


    Did you update with fullscan, if not by what %?

    I just ran:

    exec sp_updatestats

    In other words, you did a sampled update.

    Please do an update statistics with full scan on those tables and see if the plan changes.

    Gail Shaw
  • GilaMonster (8/9/2011)


    chandan_jha18 (8/9/2011)


    Ninja's_RGR'us (8/9/2011)


    Did you update with fullscan, if not by what %?

    I just ran:

    exec sp_updatestats

    In other words, you did a sampled update.

    Please do an update statistics with full scan on those tables and see if the plan changes.

    Did you actually find anything problematic in the plan which I attached?

  • Yes, stats seem stale / wrong.

    The row estimate is 1 and the actual count is over 100K, IIRC. That's why we're all pounding on this one.
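    One way to check whether the statistics behind that 1-vs-100K estimate are stale is to inspect them directly (the index name below is a placeholder; substitute the index the seek actually uses):

    ```sql
    -- STAT_HEADER shows when the statistics were last updated and how many
    -- rows were sampled; a tiny sample on a 31M-row table explains bad estimates.
    DBCC SHOW_STATISTICS ('dbo.Transactions', 'IX_Transactions_SomeKey')
        WITH STAT_HEADER;
    ```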

Viewing 15 posts - 1 through 15 (of 43 total)
