Parameterised Stored Procedure Execution Plan Problem

  • I have a query that is executed in a stored procedure with a WHERE clause as follows:

    WHERE (cm.contact_id = @contact_id OR @contact_id = 0)

    AND (a.branch_id = @branch_id OR @branch_id = 0)

    The procedure will only ever be run with either a branch_id or a contact_id (the other id will be 0). If I run this procedure for a branch and then for a contact, the second call takes about 10 seconds because the query engine does a full table scan before applying the filter. E.g.

    EXEC prc @branch_id = 100, @contact_id = 0

    EXEC prc @branch_id = 0, @contact_id = 100

    If I run the SELECT statement on its own twice, substituting literal values for branch_id and contact_id, it takes less than a second to return both result sets, and the execution plan shows no table scan.

    Is there any way to change the behaviour of the stored procedure so that it doesn't cache a useless execution plan? 

    By the way, if I use two otherwise identical SELECT statements, each with its own WHERE clause (one for branch, one for contact), it runs in under 1s again, but I don't want to maintain multiple near-identical SELECT statements for maintenance reasons (especially when more parameters need to be introduced). This didn't happen in SQL 2000; I believe it has something to do with the way 2005 caches plans for individual statements rather than for the whole procedure.

    Recompiling the stored procedure after each call also works, but then what is the point of having a precompiled execution plan?

    Any help greatly appreciated.
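
    For reference, here is a minimal sketch of the procedure as described (the WHERE clause and the procedure name are from the snippets above; the table and column names are placeholders I've made up):

    CREATE PROCEDURE prc
        @branch_id  INT,
        @contact_id INT
    AS
    BEGIN
        SET NOCOUNT ON;

        -- hypothetical tables and columns, for illustration only
        SELECT cm.contact_id, a.branch_id, a.account_name
        FROM dbo.accounts a
        INNER JOIN dbo.contact_map cm ON cm.account_id = a.account_id
        WHERE (cm.contact_id = @contact_id OR @contact_id = 0)
          AND (a.branch_id = @branch_id OR @branch_id = 0);
    END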

     

  • Is there a reason you don't specify, instead,

    WHERE (cm.contact_id = @contact_id) OR (a.branch_id = @branch_id)

    and not care if both variables = 0?

    See if that speeds up your execution?

  • Sounds like a typical parameter sniffing problem.

    Let's say the first time SQL comes to execute this, the value for @branch_id is 100 and @contact_id is 0. SQL optimises the proc based on those parameters and decides that the best execution plan uses a nonclustered index on branch_id with a lookup back to the clustered index, filtering on contact_id afterwards.

    This execution plan is then saved and used for that query every time it runs, regardless of parameter changes.

    The next time the proc runs, the cached execution plan is retrieved and executed; however, the parameters have changed, and the plan that was optimal for the first call no longer is. SQL will not recompile it, though, and you'll be stuck with the bad plan until it's thrown out of the cache.

    My recommendation would be to split this into two queries so that each can get an optimal execution plan. If you don't want to do that, add OPTION (RECOMPILE) to the end of the query, so that each time it runs that individual statement is recompiled, as it has no single optimal execution plan. Something like the sketch below.
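
    (Sketch only; the table and column names are placeholders, and the hint at the end is the only change to the original query.)

    SELECT cm.contact_id, a.branch_id, a.account_name
    FROM dbo.accounts a
    INNER JOIN dbo.contact_map cm ON cm.account_id = a.account_id
    WHERE (cm.contact_id = @contact_id OR @contact_id = 0)
      AND (a.branch_id = @branch_id OR @branch_id = 0)
    OPTION (RECOMPILE); -- recompile just this statement on each execution, so the plan matches the actual parameter values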

    btw, SQL 2000 handled ORs very badly, so it was probably coming up with a plan that was sub-optimal for both cases.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Thank you for the information. I think I will go for the multiple SELECT statements, one for each parameter (roughly as sketched at the end of this post), rather than the recompile, but OPTION (RECOMPILE) will come in handy where there are many more parameters or where cached execution plans aren't offering a big improvement.

    Re. the other post, your suggestion unfortunately wouldn't work because there are NULL values in both ID columns, and rows with NULL ids need to be returned when the id parameter is set to 0.
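
    What I ended up with is roughly this shape (placeholder names again; the point is that each branch filters on a single parameter):

    IF @branch_id <> 0
    BEGIN
        -- branch lookup only
        SELECT cm.contact_id, a.branch_id, a.account_name
        FROM dbo.accounts a
        INNER JOIN dbo.contact_map cm ON cm.account_id = a.account_id
        WHERE a.branch_id = @branch_id;
    END
    ELSE
    BEGIN
        -- contact lookup only
        SELECT cm.contact_id, a.branch_id, a.account_name
        FROM dbo.accounts a
        INNER JOIN dbo.contact_map cm ON cm.account_id = a.account_id
        WHERE cm.contact_id = @contact_id;
    END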

  • You can add (@a = 0 AND a IS NULL) to capture the case where the input is 0 and the actual column value is NULL.
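
    For example, applied to the contact filter from the original query (illustration only; how you combine it with the branch filter depends on the result set you actually need):

    (cm.contact_id = @contact_id OR (@contact_id = 0 AND cm.contact_id IS NULL))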

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

  • One other method is declaring local variables, assigning them the values of the parameters, and then using the local variables in place of the parameters in the WHERE clause (see the sketch below). This speeds up the procedure significantly (~1s rather than ~10s), but in my scenario the separate SELECT statements (one for each parameter) are still about 30 times faster, so that is what I have had to opt for.
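
    In case it helps anyone else, the shape of it is roughly this (placeholder table and column names apart from the parameters):

    CREATE PROCEDURE prc
        @branch_id  INT,
        @contact_id INT
    AS
    BEGIN
        -- copy the parameters into local variables; the optimiser can't sniff
        -- local variable values at compile time, so it builds a generic plan
        -- rather than one tailored to the first call's parameter values
        DECLARE @local_branch_id INT, @local_contact_id INT;
        SELECT @local_branch_id = @branch_id, @local_contact_id = @contact_id;

        SELECT cm.contact_id, a.branch_id, a.account_name
        FROM dbo.accounts a
        INNER JOIN dbo.contact_map cm ON cm.account_id = a.account_id
        WHERE (cm.contact_id = @local_contact_id OR @local_contact_id = 0)
          AND (a.branch_id = @local_branch_id OR @local_branch_id = 0);
    END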

  • The local variables work because the query optimiser can't sniff their values when it compiles the proc; hence it generates an average-case plan, optimised for no particular values.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
