Using Parameters in a Query is Killing Performance

  • Ok, I give up, someone please help! We have a query that runs just fine with hard coded values but once we swap them out for parameters the query runs with no end in sight.

    This version consistently returns 8,429 rows in less than 2 seconds:

    SELECT username, firstname, lastname

    FROM Users

    WHERE UpdateDt BETWEEN '2/11/2016' AND '2/12/2016';

This version runs for well over an hour without returning data:

    DECLARE @Dt1 DATETIME, @dt2 DATETIME;

    SET @Dt1 = '2/11/2016';

    SET @dt2 = '2/12/2016';

    SELECT username, firstname, lastname

    FROM Users

    WHERE UpdateDt BETWEEN @Dt1 AND @dt2;

    The UpdateDt column in the Users table is DATETIME so I wouldn't expect any datatype conversion issues.

    Any theories on what could be causing this?

  • When you use variables in the predicate like that, SQL Server generates an execution plan with cardinality estimates based on the average number of rows per value across the entire table, using the density information in the relevant statistics object. This is effectively a query plan compiled for the "average" row count for any value you might pass in.

    When you use the literal values, SQL Server is able to base its estimates on those exact values, which often results in much more accurate estimates and more efficient execution plans.

If you look at the actual execution plans for the two queries (though from the sound of it you won't be able to get the actual plan for the variable version, since it runs so long), you'll likely see that the estimated row counts in the variable version differ from those in the literal version, and probably differ significantly from the actual row counts.

This is a well-explored topic; one quick write-up of it (among many out there) is at http://www.benjaminnevarez.com/2010/06/how-optimize-for-unknown-works/, where this behavior is compared to the OPTIMIZE FOR UNKNOWN hint.
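If you want to see the density information the optimizer falls back on, DBCC SHOW_STATISTICS exposes it (the statistics name below is hypothetical; substitute whatever statistics object covers UpdateDt):

-- statistics name is hypothetical; use the one that exists on UpdateDt
DBCC SHOW_STATISTICS ('dbo.Users', IX_Users_UpdateDt) WITH STAT_HEADER, DENSITY_VECTOR;

For an equality predicate against an unknown value, the estimate is roughly "All density" multiplied by the table's row count; range predicates over variables fall back to similar fixed guesses.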

    Cheers!

  • Make the PK on UserID non-clustered and create a clustered index on UpdateDt.
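A rough sketch of that change (the constraint and index names here are hypothetical, and any foreign keys referencing the PK would have to be dropped first):

ALTER TABLE dbo.Users DROP CONSTRAINT PK_Users;  -- hypothetical constraint name
CREATE CLUSTERED INDEX CIX_Users_UpdateDt ON dbo.Users (UpdateDt);
ALTER TABLE dbo.Users ADD CONSTRAINT PK_Users PRIMARY KEY NONCLUSTERED (UserID);

Creating the clustered index before re-adding the PK avoids rebuilding the nonclustered PK twice.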

    _____________
    Code for TallyGenerator

  • Sergiy (2/17/2016)


    Make the PK on UserID non-clustered and create a clustered index on UpdateDt.

Unless UpdateDt is not unique, in which case perhaps a clustered index on a composite key of UpdateDt and UserID (in that order).
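For instance (index name hypothetical):

CREATE CLUSTERED INDEX CIX_Users_UpdateDt_UserID ON dbo.Users (UpdateDt, UserID);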

    "I cant stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • The UpdateDt column in the Users table is DATETIME so I wouldn't expect any datatype conversion issues.

Perhaps... To know for sure, check the query plan (the actual plan, not the estimated one); look for IMPLICIT_CONVERSIONs.

Also look at the cardinality estimates for huge differences between estimated and actual row counts; that could be a sign of parameter sniffing. You can also test the query with OPTION (RECOMPILE) to see if it produces a better plan.
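For example, you can capture the actual plan and test the recompile hint in one go (a sketch against the posted table; DECLARE with an initializer needs SQL Server 2008 or later):

DECLARE @Dt1 datetime = '20160211', @dt2 datetime = '20160212';

SET STATISTICS XML ON;  -- returns the actual execution plan alongside the results

SELECT username, firstname, lastname
FROM Users
WHERE UpdateDt BETWEEN @Dt1 AND @dt2
OPTION (RECOMPILE);     -- lets the optimizer use the variables' runtime values

SET STATISTICS XML OFF;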

    Edit: used wrong IFCode for the quoted statement

    "I cant stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • Alan.B (2/17/2016)


    The UpdateDt column in the Users table is DATETIME so I wouldn't expect any datatype conversion issues.

Perhaps... To know for sure, check the query plan (the actual plan, not the estimated one); look for IMPLICIT_CONVERSIONs.

Also look at the cardinality estimates for huge differences between estimated and actual row counts; that could be a sign of parameter sniffing. You can also test the query with OPTION (RECOMPILE) to see if it produces a better plan.

    Edit: used wrong IFCode for the quoted statement

    Unless the OP posted different code than they're actually using, it won't be parameter sniffing, since those are local variables. It'll still likely just be an estimate mismatch, but due to the optimization for the "average case" when using variables in a predicate and not bad parameter sniffing 🙂

    Cheers!

  • Jacob Wilkins (2/17/2016)


    Alan.B (2/17/2016)


    The UpdateDt column in the Users table is DATETIME so I wouldn't expect any datatype conversion issues.

Perhaps... To know for sure, check the query plan (the actual plan, not the estimated one); look for IMPLICIT_CONVERSIONs.

Also look at the cardinality estimates for huge differences between estimated and actual row counts; that could be a sign of parameter sniffing. You can also test the query with OPTION (RECOMPILE) to see if it produces a better plan.

    Edit: used wrong IFCode for the quoted statement

    Unless the OP posted different code than they're actually using, it won't be parameter sniffing, since those are local variables. It'll still likely just be an estimate mismatch, but due to the optimization for the "average case" when using variables in a predicate and not bad parameter sniffing 🙂

    Cheers!

    We have a query that runs just fine with hard coded values but once we swap them out for parameters the query runs with no end in sight.

    "I cant stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • A lot of people mix up parameters and variables. The code posted has local variables, not parameters.

As written, the cause is likely the lack of parameter sniffing (variables can't be sniffed), but we really need both execution plans to be sure.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Alan.B (2/17/2016)


    Jacob Wilkins (2/17/2016)


    Alan.B (2/17/2016)


    The UpdateDt column in the Users table is DATETIME so I wouldn't expect any datatype conversion issues.

Perhaps... To know for sure, check the query plan (the actual plan, not the estimated one); look for IMPLICIT_CONVERSIONs.

Also look at the cardinality estimates for huge differences between estimated and actual row counts; that could be a sign of parameter sniffing. You can also test the query with OPTION (RECOMPILE) to see if it produces a better plan.

    Edit: used wrong IFCode for the quoted statement

    Unless the OP posted different code than they're actually using, it won't be parameter sniffing, since those are local variables. It'll still likely just be an estimate mismatch, but due to the optimization for the "average case" when using variables in a predicate and not bad parameter sniffing 🙂

    Cheers!

    We have a query that runs just fine with hard coded values but once we swap them out for parameters the query runs with no end in sight.

Sure, but actual code was given. Either the posted code is what they're actually running and the terminology is just being used loosely, or the terminology is being used precisely and the posted code is not what they're running.

    I find it more likely that the terminology is being used loosely, but you may be right. We'll just have to wait and see 🙂

    Cheers!

Try the RECOMPILE option; the recompile time will be trivial compared to the I/O time.

    Also, do review the indexes on the table: getting the best clustered index on every table is critical for best performance.

DECLARE @Dt1 datetime, @dt2 datetime;

-- YYYYMMDD is unambiguous regardless of DATEFORMAT/language settings
SET @Dt1 = '20160211';
SET @dt2 = '20160212';

SELECT username, firstname, lastname
FROM Users
-- a half-open range (>= / <) avoids the end-point pitfalls of BETWEEN with DATETIME
WHERE UpdateDt >= @Dt1 AND UpdateDt < @dt2
OPTION (RECOMPILE);

SQL DBA, SQL Server MVP (07, 08, 09)

"It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." -- "Norm", on "Cheers"

Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies."

  • It's what Gail said. Those are local variables. They're not parameters. Local variables are not sniffed the way parameters are except in the case of a statement level recompile. The suggestions to use a recompile hint may help. If the performance is good with hard coded values, it's entirely likely that you need parameter sniffing to assist your performance. Parameter sniffing is, in most cases, a good thing. You need to look at the execution plans with the local variables and with the hard coded values to understand what the differences are.
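To illustrate the distinction: wrapped in a procedure (the name here is hypothetical), the same values arrive as true parameters and are sniffed when the plan is compiled:

CREATE PROCEDURE dbo.GetUpdatedUsers   -- hypothetical procedure name
    @Dt1 datetime,
    @dt2 datetime
AS
BEGIN
    -- @Dt1 and @dt2 are parameters here, so their values are sniffed at compile time
    SELECT username, firstname, lastname
    FROM Users
    WHERE UpdateDt BETWEEN @Dt1 AND @dt2;
END;
GO

EXEC dbo.GetUpdatedUsers @Dt1 = '20160211', @dt2 = '20160212';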

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Is the table partitioned?

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato

  • Loner (2/17/2016)


    Ok, I give up, someone please help! We have a query that runs just fine with hard coded values but once we swap them out for parameters the query runs with no end in sight.

    This version consistently returns 8,429 rows in less than 2 seconds:

    SELECT username, firstname, lastname

    FROM Users

    WHERE UpdateDt BETWEEN '2/11/2016' AND '2/12/2016';

This version runs for well over an hour without returning data:

    DECLARE @Dt1 DATETIME, @dt2 DATETIME;

    SET @Dt1 = '2/11/2016';

    SET @dt2 = '2/12/2016';

    SELECT username, firstname, lastname

    FROM Users

    WHERE UpdateDt BETWEEN @Dt1 AND @dt2;

    The UpdateDt column in the Users table is DATETIME so I wouldn't expect any datatype conversion issues.

    Any theories on what could be causing this?

So what does the CREATE TABLE for this table look like, what are the indexes on it, and how many rows does it have?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Alan.B (2/17/2016)


    Sergiy (2/17/2016)


    Make the PK on UserID non-clustered and create a clustered index on UpdateDt.

Unless UpdateDt is not unique, in which case perhaps a clustered index on a composite key of UpdateDt and UserID (in that order).

A clustered index does not have to be unique; SQL Server adds a hidden uniquifier to duplicate key values as needed.

Which does not mean I'd object to your suggestion.

    _____________
    Code for TallyGenerator

The table is not partitioned and there is no index on the date column.
