Query Analyzer behaves differently

  • I have seen this several times now, and can't figure out what is happening.

We see a stored proc that we believe is a poor performer in our production DB (usually identified with Profiler). We take the SP into QA and spend time analyzing it to modify/optimize it. Then we move the new, improved version into production, sometimes believing it to be many, many times faster and more efficient, only to find it performs NOTHING like it did in QA. From time to time we find things like places in the code where casting is being done, or a particular param that isn't the most efficient. But most of the time, there just doesn't seem to be any rhyme or reason.

Can anyone give me some ideas about differences in behavior between QA and regular use in production? Or point me toward some reading? Searching hasn't paid off for me yet.

    Thanks

  • Sure... when you're using QA, you may run the same thing more than once. Data is cached and everything looks rosy because of it.

    If you really want to see how something is going to perform, include the following in your testing in QA...

DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
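
A cold-cache test run in QA might look something like this (just a sketch; the proc name and parameter are made up):

-- Flush cached plans and clean buffer pages so nothing is pre-warmed
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
GO
-- Report I/O and timing figures for the cold run
SET STATISTICS IO ON
SET STATISTICS TIME ON
GO
EXEC dbo.usp_MySlowProc @SomeParam = 42 -- hypothetical proc and parameter
GO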

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • The first thing that you should do is to compare the QA query plan to your Production system's query plan.
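
One way to do that from QA before you ever run the thing is SHOWPLAN; a quick sketch (again, the proc name and parameter are made up):

-- Return the estimated plan as text; the statement is NOT executed
SET SHOWPLAN_TEXT ON
GO
EXEC dbo.usp_MySlowProc @SomeParam = 42 -- hypothetical proc and parameter
GO
SET SHOWPLAN_TEXT OFF
GO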

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]

  • Maybe I've not had enough coffee... or maybe just too much scotch.

    How do you check the production execution plan without getting QA involved, Barry?

    --Jeff Moden



Jeff - I am using the DBCC commands. Our initial work and testing is on a separate machine, so we can do it that way. But then we move over to production and run it there (both as raw code in QA and by calling the SP). Thanks.

Barry - Thanks for the response. I've got to second Jeff's question. If I'm using QA on the production machine to look at the execution plan of the raw code in the SP, will there be a different plan when the SP is called?

    jim in FL

  • Heh. I thought that you were comparing QA to production, as in QA="Quality Assurance" server vs. your Production server. I forgot that here QA="Query Analyzer".

I think that the same advice still applies, though. I always check the query plans in QA (Query Analyzer) before I release them to production. When it is running in production you can use SQL Profiler to capture the query plan (I *think* that this works in SQL 2000 also, but I cannot check it right now).

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]

OK, I was able to find one last SQL 2000 server that I can access from here (it is being shut down at the end of the month). You can get the execution plan out of Profiler by choosing the Performance..Execution Plan event and making sure that the TextData data column is selected.
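
If you would rather query than trace, SQL 2000 also exposes the plan cache through master.dbo.syscacheobjects, so something like this (a sketch; the proc name is made up) will show whether a plan is cached and how often it has been reused:

-- Look for cached plans whose text mentions the proc (needs appropriate rights)
SELECT cacheobjtype, objtype, usecounts, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%usp_MySlowProc%' -- hypothetical proc name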

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]

When you are modifying the code in QA, are you using the guts of the stored procedure and executing it as a batch? I often do this. You could be getting hit by parameter sniffing. Perhaps try WITH RECOMPILE on the sproc to keep this from happening. This is most problematic when different input parameters can result in widely varying rows affected/joined/output. A cached nested loop plan that ends up hitting 5M rows on a join will be a total dog!!!
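
If you would rather not alter the proc itself, SQL 2000 also lets you force a fresh plan for a single call, or flag the proc to recompile on its next run; a sketch with made-up names:

-- Recompile just this one execution; the new plan is not cached
EXEC dbo.usp_MySlowProc @SomeParam = 42 WITH RECOMPILE

-- Or mark the proc so it recompiles the next time it runs
EXEC sp_recompile 'dbo.usp_MySlowProc'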

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
TheSQLGuru on Google's mail service

  • I had this situation before and it was caused by parameter sniffing.

    Imagine a table of everybody in the world:

select *
from world
where country = 'Belgium'

Assuming a (non-clustered) index on country, this query would probably use the index to retrieve the data. Now consider this query:

select *
from world
where country = 'China'

As the query will return approximately 20% of the table, SQL Server will probably use a table scan to return the data.

In a stored procedure, the query plan is cached, and I ended up with the second query using the same query plan as the first - the result was a slow-running procedure in Live that was fine on the Dev and Test boxes.

We solved this by adding WITH RECOMPILE to the procedure and using local variables in the queries rather than the parameters passed into the proc:

create procedure [get_everybody] @country varchar(50)
WITH RECOMPILE
AS
declare @pr_country varchar(50)
set @pr_country = @country

select *
from world
where country = @pr_country
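
Called like this, each execution then gets its own plan, and the local variable keeps any one parameter value from being baked into it:

exec get_everybody @country = 'Belgium'
exec get_everybody @country = 'China' -- no longer stuck with Belgium's index-seek plan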

As we are running a warehousing system, the overhead of recompiling the proc every time is not a concern. I am not sure about this in a high-volume transactional system.

Not sure whether your situation is similar, but it is something to think about.

    Jez
