Performance Hit When Using Variable Instead of Literals.

I have a query that does a Select from a VIEW consisting of 4 tables. Since this is PeopleSoft, and they've never heard of Stored Procedures, the whole thing is being passed as a SQL string.

    I am trying to troubleshoot the performance in QA. I have noticed that when I run the query with literals in the criteria it runs in less than 1 second, e.g.

Select field1, field2, field3 from VIEW where criteria = 'somestring'

    When I run this and plug in variables, it decides to use a different query plan and takes well over a minute to run the exact same thing, e.g.

    Declare @SomeString VarChar(15)

    Set @SomeString = 'somestring'

Select field1, field2, field3 from VIEW where criteria = @SomeString

Neither query, with the particular criteria I am using, returns any rows.

Note: this is a simplified query example. The actual query is much larger and uses several more criteria, but I can use literals in the other 5 criteria and just change one to a variable to make this behavior occur. We have indexed the heck out of these tables, six ways from Sunday, and even the SQL Index Tuning Wizard can't make any suggestions.

    Why would SQL change query plans based on the way criteria is specified?

    I can post more info if needed. I was just trying to keep it simple here.

    Thanks,

    Chris

I had more or less the same problem a few months ago.

    Check out the collation settings.

If the server default collation differs from the database collation or from the column collation, you might experience that kind of problem.

Check the access plan to see whether a generated CONVERT appears in the where clause.



    Bye
    Gabor
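Gabor's suggestion can be checked with a few queries. A minimal sketch, assuming SQL Server's standard metadata functions (MyTable is a placeholder name, and the COLLATE value shown is just an example):

```sql
-- Compare the server, database, and column collations.
SELECT SERVERPROPERTY('Collation')                AS ServerCollation;
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS DatabaseCollation;

-- Column-level collations for one of the tables behind the view
-- (MyTable is a placeholder name).
SELECT COLUMN_NAME, COLLATION_NAME
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'MyTable';

-- If they differ, the plan may contain an implicit CONVERT on the
-- indexed column, which prevents an index seek. Forcing the comparison
-- to the column's collation avoids the convert:
SELECT field1, field2, field3
FROM   VIEW
WHERE  criteria = @SomeString COLLATE SQL_Latin1_General_CP1_CI_AS
```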

I have noticed this phenomenon on several occasions, and in fact it happens in other databases as well, like Oracle and IBM DB2. Unfortunately, I have never found an answer. If Gabor's solution of collation settings works, then that's great, but otherwise I don't know what to do. I have seen this question posted before and I don't think anyone has a good answer (I know I replied with the same thing I am saying now).

I have heard from two sources now that this is typical for SQL Server: because it doesn't know the value that will be in the variable, it cannot use statistics to help build the execution plan. This seems dumb to me, and I can't believe that MS wouldn't have thought about this when they designed the optimizer.

It seems like they'd wait to optimize the query (create the execution plan) until all of the possible elements had been assembled, just before asking SQL Server to execute it. This would make sense to me because SQL Server MUST have all of the elements (criteria) before it submits the query to be executed.

    Chris

When you use variables, the plan can vary a lot depending on the number of records returned. If, for example, the query is very selective (very few rows), you may end up with a different plan than if it returns 5000 rows.

When you use literals, the optimizer KNOWS beforehand what you are asking to return and can create a better plan.

One of the solutions I have for that problem is to invoke the query using sp_executesql with parameters; because it knows the parameter values, it can recreate the plan if they vary a lot, or reuse the plan if they don't.

      


    * Noel
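For the simplified query above, Noel's sp_executesql approach might look like this. A minimal sketch; @crit is a placeholder parameter name:

```sql
-- Passing the value as a parameter (not a local variable inside the
-- batch) lets the optimizer see the value, build a plan for it, and
-- reuse that plan on later calls.
Declare @SomeString VarChar(15)
Set @SomeString = 'somestring'

Exec sp_executesql
    N'Select field1, field2, field3 from VIEW where criteria = @crit',
    N'@crit VarChar(15)',
    @crit = @SomeString
```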

I had one of the DBAs helping research this try it with sp_ExecuteSQL. The run time went from 47 seconds or more to less than 2, which is similar to the performance when specifying literals in the criteria.

    This is an EXCELLENT development, except now I have to figure out how to get PeopleCode to execute a stored procedure.

    Now, where's my BIG hammer....

    Chris
