November 11, 2009 at 3:49 pm
I had a similar problem with one of my queries. What I did was hard-code the datetime values into the WHERE clause and run the query, then use variables in the WHERE clause for the datetime values and run it again.
I compared the two execution plans and found that when the datetime variables were passed, the query used only the clustered index on the table, whereas when I hard-coded the dates it used a different nonclustered index on the date field.
I was able to minimize the run time of the query that used variables by forcing it to use the nonclustered index on the date field, which more than halved the query time.
Try this and let me know how you go.
November 11, 2009 at 5:39 pm
mpeters-878100 (11/11/2009)
I had a similar problem with one of my queries. What I did was hard-code the datetime values into the WHERE clause and run the query, then use variables in the WHERE clause for the datetime values and run it again. I compared the two execution plans and found that when the datetime variables were passed, the query used only the clustered index on the table, whereas when I hard-coded the dates it used a different nonclustered index on the date field.
I was able to minimize the run time of the query that used variables by forcing it to use the nonclustered index on the date field, which more than halved the query time.
Try this and let me know how you go.
Using index hints to solve performance issues is not the way to go except as a last resort; proper indexes are the best answer. Also, if you are using SQL Server 2005 or higher, you may want to look at covering indexes using the INCLUDE clause when creating or altering an index, if the number of columns needed for the query is small.
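For example, a covering index might look like this (a minimal sketch; the table and column names just follow the tableA example used elsewhere in this thread, not anyone's real schema):

```sql
-- Hypothetical covering index: a nonclustered index keyed on the date column,
-- with the other columns the query needs carried in the leaf level via
-- INCLUDE, so the query can be answered from the index alone (no key lookups).
CREATE NONCLUSTERED INDEX IX_tableA_date
    ON dbo.tableA ([date])
    INCLUDE (col1, a1);
```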
November 11, 2009 at 5:41 pm
Jim-720070 (11/2/2009)
Apologies for adding to what seems to be a resolved thread.. BUT! I was having a very, very similar issue. My sp looked something like this:
CREATE PROCEDURE sp_blahblah @month INT -- YYYYMM format
AS
BEGIN
--use the @month variable to work out the start and end month from a
--customer defined calendar table
DECLARE @start SMALLDATETIME
DECLARE @stop SMALLDATETIME
SELECT col1,col2,col3
FROM tableA
INNER JOIN tableB ON a1 = b1
WHERE tableA.date BETWEEN @start AND @stop
END
Now, when I executed the above, the query would take about 30 seconds to run. However, when I hardcoded the dates, I was looking at a fraction of the time and a completely different execution plan.
The resolution for me was to manually recalculate the statistics on each table I was using. Obviously this is something that would be handled in a maintenance plan on a production server. But it worked for me! And I searched all over the web for help and ended up right here at home.. 😀
Seems a little incomplete; how are @start and @stop getting populated? What are the indexes on the table?
November 11, 2009 at 6:12 pm
Jim, what would happen if you put the final select to another stored procedure and passed these retrieved dates as parameters to it?
create procedure innerProc (@start datetime, @stop datetime)
as
SELECT col1,col2,col3
FROM tableA
INNER JOIN tableB ON a1 = b1
WHERE tableA.date BETWEEN @start AND @stop
The thing is that if you use variables in queries, the optimizer doesn't have a clue what their values might be at compile time. It takes a defensive approach, often resulting in index scans rather than seeks.
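The outer procedure would then just compute the dates and call the inner one, something like this (a sketch based on the sp_blahblah example above; the calendar lookup is left as a comment):

```sql
CREATE PROCEDURE sp_blahblah @month INT -- YYYYMM format
AS
BEGIN
    DECLARE @start SMALLDATETIME, @stop SMALLDATETIME;

    -- ... derive @start/@stop from the customer-defined calendar table ...

    -- Because @start/@stop arrive as parameters, the optimizer can sniff
    -- their actual values when it compiles innerProc's plan.
    EXEC dbo.innerProc @start = @start, @stop = @stop;
END
```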
Regards
Piotr
...and your only reply is slàinte mhath
November 12, 2009 at 10:46 am
****EDIT******************************
skip this post, I was incorrect, as Lynn points out
*************************************
And try changing
WHERE tableA.date BETWEEN @start AND @stop
to:
WHERE tableA.date <= @stop AND tableA.date >= @start
looks more cumbersome, but performs better.
---------------------------------------------------------
How best to post your question
How to post performance problems
Tally Table: What it is and how it replaces a loop
"stewsterl 80804 (10/16/2009): I guess when you stop and try to understand the solution provided, you not only learn but save yourself some headaches when you need to make any slight changes."
November 12, 2009 at 11:13 am
jcrawf02 (11/12/2009)
And try changing
WHERE tableA.date BETWEEN @start AND @stop
to:
WHERE tableA.date <= @stop AND tableA.date >= @start
looks more cumbersome, but performs better.
Funny, but when I compare the execution plans of a similar query on my development server, the two return the exact same execution plan. How does the latter perform better?
November 12, 2009 at 12:08 pm
D'oh! Thanks for the correction, I was incorrect. Edited my previous post to indicate that, but left it in place so that your quote made sense in the context of the thread.
Somehow I had convinced myself that this was true, some weird mental combination of "don't use functions in the WHERE clause" (Gail Shaw's Dirty Dozen, which I was personally abusing in the form of dateadd() or datediff()) and "don't use BETWEEN on datetime comparisons" (from Jonathan Kehayias' post on How to Find Rows of Data Between Two Dates).
I apologize for the confusion, and thank you for the opportunity to correct my misconception!
Jon
November 13, 2009 at 7:26 am
The 'best' way to get the optimal plan for almost everything covered in this thread is dynamic SQL. Yep, you pay a small price for this in compilations and plan cache bloat, but you avoid the disastrous plans that can result from the 'imprecise' nature of widely-disparate-value variable queries.
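For the query in this thread, that might look something like the following (just a sketch; embedding the literal dates is what lets the optimizer build a plan for the actual values, at the cost of a compile per distinct date range):

```sql
DECLARE @sql NVARCHAR(MAX);

-- Build the statement with the date values inlined as literals
-- (style 112 = yyyymmdd, which is language-setting-safe for datetime literals).
SET @sql = N'SELECT col1, col2, col3
             FROM tableA
             INNER JOIN tableB ON a1 = b1
             WHERE tableA.date >= ''' + CONVERT(CHAR(8), @start, 112) + N'''
               AND tableA.date <= ''' + CONVERT(CHAR(8), @stop, 112) + N''';';

EXEC (@sql);
```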
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service