Tuning Complex Procedures

  • Hello,

    I was wondering if anyone has written something like this.  Basically, I am trying to tune some stored procedures, and I am looking for a stored procedure or tool that would let me run a problematic stored procedure for a specified number of iterations and then aggregate results like CPU, reads, writes, etc. to a table.  One of the main problems I'm having with these complex procedures is that I have to change the value of the parameters every time, otherwise the second run doesn't change any data.  So this tool or sproc would also need a way to dynamically change the parameters on each run.  Having the option to run multiple threads would be nice too.  Has anyone heard of anything like that?  SQLQueryStress is close but doesn't change the params.  I doubt this exists, but just wondering.  I suck at perf tuning.
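    For illustration, even a crude harness like the sketch below would get me part of the way.  Everything here is a placeholder I made up: dbo.MyProblemProc, the @CustomerId parameter, and the logic that varies it each run.

    CREATE TABLE #Results
    (
        RunNumber  int,
        ParamValue int,
        DurationMs int
    );

    DECLARE @i int = 1, @CustomerId int, @Start datetime2(3);

    WHILE @i <= 50
    BEGIN
        -- vary the parameter each run so the proc actually changes data
        SET @CustomerId = @i * 100;  -- placeholder logic; pull real values from a driver table

        SET @Start = SYSDATETIME();
        EXEC dbo.MyProblemProc @CustomerId = @CustomerId;

        INSERT INTO #Results (RunNumber, ParamValue, DurationMs)
        VALUES (@i, @CustomerId, DATEDIFF(millisecond, @Start, SYSDATETIME()));

        SET @i += 1;
    END;

    -- aggregate the runs; CPU and reads per run could also be sampled
    -- from sys.dm_exec_procedure_stats before and after each EXEC
    SELECT COUNT(*) AS Runs, AVG(DurationMs) AS AvgMs, MAX(DurationMs) AS MaxMs
    FROM #Results;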

    Also, I am able to isolate the part of the procedure that runs poorly, but it usually relies on staging tables and variables filled in by earlier sections of the code, so I can't run it by itself.


  • My approach to this is:

    1 - get it set up in a test environment

    2 - put the query into a transaction

    3 - roll back the transaction on completion.

    The above approach allows you to re-run the EXACT same code with minimal database changes (some changes still occur; identity values still increment, for example).
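    In script form, that pattern is just this (dbo.MyProblemProc and its parameter are placeholders):

    BEGIN TRANSACTION;

    EXEC dbo.MyProblemProc @CustomerId = 42;

    -- inspect the results here if you need to, then undo everything
    ROLLBACK TRANSACTION;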

    Then grab the execution plan and tune from that.  This is MUCH safer than repeatedly running the stored procedure with different parameters.  It MAY be that with a parameter value of 1 it is fast, but 100000000 causes slowness.  That would likely be a parameter sniffing problem or potentially a bad index.
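    One rough way to check for sniffing (procedure name and values are made up here): if the proc is fast when you force a fresh plan but slow otherwise, the cached plan is a likely suspect.

    EXEC dbo.MyProblemProc @CustomerId = 1;  -- may reuse whatever plan is cached

    EXEC dbo.MyProblemProc @CustomerId = 100000000 WITH RECOMPILE;  -- compiles a fresh plan for this value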

    If I need to work with complex stored procedures, my approach is first - can I replicate it on test?  If I can, then tuning is easier as I don't impact production.  The execution plan will likely tell you what is "slow", so you know which part you should focus on.  I say "likely" because sometimes the "slow" part according to the execution plan turns out to be light on disk I/O or "fast" on CPU time.

    That is my approach anyways.  I generally avoid running a stored procedure without rolling it back.  And if that stored procedure is taking out an exclusive lock, it could very well be that it cannot run in parallel and you MAY not be able to do much to tune that.

    Generally, if a query or stored procedure is ONLY slow when it is run in parallel, it is locking/blocking that is going to be your bottleneck.  So the solution is likely going to be to reduce the blocking (note - do not use the NOLOCK hint UNLESS you are fully aware of the risks and are OK with that... and even then I would avoid NOLOCK as there are ALMOST always better options).

    Just my 2 cents.  I so far have not come across a need for a stress test tool to tune my queries.  For example, if the stored procedure completes in 1/100 of a second and requires an exclusive lock, I can run the query 100 times per second.  If an end user is complaining because they need to wait less than 1 second for the query to complete, I may have to look at teaching my users patience.

    Now, to see reads, writes, and timings, you would execute:

    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    The first one gives you the I/O per query; the second one gives you the time (CPU and elapsed) per query.
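    For example, wrapped around a (placeholder) procedure call - the output shows up on the Messages tab in SSMS:

    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    EXEC dbo.MyProblemProc @CustomerId = 42;

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;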


    The above is all just my opinion on what you should do. 
    As with all advice you find on a random internet forum - you shouldn't blindly follow it.  Always test on a test server to see if there are negative side effects before making changes to live!
    I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.

