February 23, 2012 at 8:42 pm
I'm looking for a good way to handle the WHERE clause in a query that returns a user-specified range of dates. The table I'm querying has a datetime column (DateCreated), and the sproc has an @FromDate and @ThroughDate that come from a UI. There are really four scenarios I need to cover in the WHERE clause:
1.) Neither date is specified (return all dates)
2.) Only @FromDate is specified (return all records with DateCreated >= @FromDate)
3.) Only @ThroughDate is specified (return all records with DateCreated <= @ThroughDate)
4.) Both @FromDate and @ThroughDate are specified (return all records with DateCreated between @FromDate and @ThroughDate)
Is there a clever way to handle this, or is it all brute force? Brute force is going to make for an ugly where, but if that's what I have to do so be it.
where (@FromDate IS NULL) OR (DateCreated >= @FromDate)
etc, etc, etc
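Spelled out, the brute-force version is just one such predicate per parameter, ANDed together, so each predicate passes automatically when its parameter is NULL (a sketch, using the names above):

```sql
-- Brute-force catch-all: a NULL parameter makes its predicate always true
WHERE (@FromDate IS NULL OR DateCreated >= @FromDate)
  AND (@ThroughDate IS NULL OR DateCreated <= @ThroughDate)
```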
Thanks!
February 23, 2012 at 9:20 pm
I believe the details in these articles will help. I don't fully understand how they work, but they work a treat.
http://www.sqlservercentral.com/articles/Performance+Tuning/howdynamicsqlcanbestatic2/684/
http://www.sqlservercentral.com/columnists/rmarda/howdynamiccanstaticbe.asp
February 23, 2012 at 10:03 pm
use BETWEEN...
select column where daterange between @startrange and @endrange
---
This thing is addressing problems that don't exist. It's solutionism at its worst. We are dumbing down machines that are inherently superior. - Gilfoyle
February 24, 2012 at 2:18 am
This is a classic Catch-All query.
Gail's post here summarises the different options well.
The construct you've posted above could be very bad for performance prior to SQL 2008 SP2, but you can get an optimal plan for it now using the OPTION(RECOMPILE) hint to force it to produce a new plan for each execution (treating the parameters as static each time).
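For example, the same catch-all construct with the hint applied would look something like this (a sketch, using the table and column names from the original post):

```sql
-- On SQL 2008 SP2+, OPTION (RECOMPILE) compiles a fresh plan per execution,
-- letting the optimiser embed the current parameter values as constants,
-- so the branches for NULL parameters are compiled away entirely.
SELECT DateCreated
FROM   #t
WHERE  (@FromDate IS NULL OR DateCreated >= @FromDate)
  AND  (@ThroughDate IS NULL OR DateCreated <= @ThroughDate)
OPTION (RECOMPILE);
```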
February 24, 2012 at 2:26 am
SELECT *
FROM #t
WHERE DateCreated >= ISNULL(@FromDate, DateCreated)
AND DateCreated <= ISNULL(@ThroughDate, DateCreated)
February 24, 2012 at 2:35 am
Artoo22 (2/24/2012)
SELECT *
FROM #t
WHERE DateCreated >= ISNULL(@FromDate, DateCreated)
AND DateCreated <= ISNULL(@ThroughDate, DateCreated)
It might look neater, but the use of scalar functions (ISNULL) guarantees that indexes can't be used to fulfil the query.
February 24, 2012 at 6:05 am
SELECT @FromDate = ISNULL(@FromDate, (SELECT MIN(DateCreated) FROM #t)),
       @ThroughDate = ISNULL(@ThroughDate, (SELECT MAX(DateCreated) FROM #t))
;
SELECT *
FROM #t
WHERE DateCreated BETWEEN @FromDate AND @ThroughDate
February 24, 2012 at 6:30 pm
Fantastic info, which is exactly what I knew I would get as always from this forum. Gail's post is exactly what I needed.
Thanks all!!
February 25, 2012 at 12:38 pm
Henrico Bekker (2/23/2012)
use BETWEEN...select column where daterange between @startrange and @endrange
How's that going to work if either or both variables have a value of NULL because the user didn't specify them?
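Any comparison against NULL evaluates to UNKNOWN, so a NULL boundary silently filters out every row rather than matching all of them. A quick sketch of the failure mode:

```sql
-- Both parameters left NULL: the user specified neither date
DECLARE @startrange DATETIME,
        @endrange   DATETIME;

SELECT DateCreated
FROM   #t
WHERE  DateCreated BETWEEN @startrange AND @endrange; -- zero rows, not all rows
```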
--Jeff Moden
Change is inevitable... Change for the better is not.
February 25, 2012 at 12:58 pm
HowardW (2/24/2012)
Artoo22 (2/24/2012)
SELECT *
FROM #t
WHERE DateCreated >= ISNULL(@FromDate, DateCreated)
AND DateCreated <= ISNULL(@ThroughDate, DateCreated)
It might look neater, but the use of scalar functions (ISNULL) guarantees that indexes can't be used to fulfil the query.
The word "guarantees" may be a bit too strong in this case. Please see the following QOTD...
http://www.sqlservercentral.com/questions/Indexes/86916/
--Jeff Moden
Change is inevitable... Change for the better is not.
February 25, 2012 at 1:14 pm
BSavoie (2/24/2012)
Fantastic info, which is exactly what I knew I would get as always from this forum. Gail's post is exactly what I needed. Thanks all!!
Considering how narrow the requirements of the query you asked about are, there's no need for dynamic SQL here at all. You know the least and greatest dates that a SQL DATETIME column can hold... use them to your advantage.
First, my general purpose million row test table. This only takes a couple of seconds to execute including the indexes...
/**********************************************************************************************************************
Purpose:
Create a voluminous test table with various types of highly randomized data.
--Jeff Moden
**********************************************************************************************************************/
--===== Do this test in a nice, safe place that everyone has.
USE tempdb
;
--===== Conditionally drop the test table to make reruns easier
IF OBJECT_ID('dbo.JBMTest','U') IS NOT NULL
DROP TABLE dbo.JBMTest
;
--===== Create and populate a 1,000,000 row test table.
-- "SomeID" has a range of 1 to 1,000,000 unique numbers
-- "SomeInt" has a range of 1 to 50,000 non-unique numbers
-- "SomeLetters2" has a range of "AA" to "ZZ" non-unique 2 character strings
-- "SomeMoney has a range of 0.00 to 100.00 non-unique numbers
-- "SomeDateTime" has a range of >=01/01/2000 and <01/01/2020 non-unique date/times
-- "SomeDate" has a range of >=01/01/2000 and <01/01/2020 non-unique "whole dates"
-- "SomeName" contains random characters at random lengths from 2 to 20 characters
SELECT TOP 1000000
SomeID = IDENTITY(INT,1,1),
SomeInt = ABS(CHECKSUM(NEWID())) % 50000 + 1,
SomeLetters2 = CHAR(ABS(CHECKSUM(NEWID())) % (ASCII('Z')-ASCII('A')+1) +ASCII('A'))
+ CHAR(ABS(CHECKSUM(NEWID())) % (ASCII('Z')-ASCII('A')+1) +ASCII('A')),
SomeMoney = CAST(RAND(CHECKSUM(NEWID())) * 100 AS DECIMAL(9,2)), --Note rounding
SomeDateTime = RAND(CHECKSUM(NEWID())) * DATEDIFF(dd,'2000','2020') + CAST('2000' AS DATETIME),
SomeDate = ABS (CHECKSUM(NEWID())) % DATEDIFF(dd,'2000','2020') + CAST('2000' AS DATETIME),
SomeName = RIGHT(NEWID(),ABS(CHECKSUM(NEWID())) % 19 + 2)
INTO dbo.JBMTest
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2
;
ALTER TABLE dbo.JBMTest
ADD CONSTRAINT PK_JBMTest PRIMARY KEY CLUSTERED (SomeID) WITH FILLFACTOR = 90
;
CREATE INDEX IX_SomeDateTime
ON dbo.JBMTest (SomeDateTime)
;
Even though the following query uses ISNULL in the WHERE clause, the ISNULLs do not contain any information from a table and are, therefore, SARGable. Run the following with the "Actual Execution Plan" enabled and you'll see the Index Seek.
DECLARE @FromDate DATETIME,
@ThroughDate DATETIME
;
SELECT @FromDate = '20050101',
@ThroughDate = '20050102'
;
SELECT SomeDateTime,
SomeID
FROM dbo.JBMTest
WHERE SomeDateTime >= ISNULL(@FromDate, '17530101')
AND SomeDateTime <= ISNULL(@ThroughDate, '99991231')
;
--Jeff Moden
Change is inevitable... Change for the better is not.
February 25, 2012 at 2:28 pm
Thanks Jeff, they're just the sort of hints and tips I love getting.
February 25, 2012 at 4:39 pm
Actually I have many more search criteria. All are strings. Can I still take the simple approach or does that change things?
Thanks as always!
February 25, 2012 at 5:02 pm
Jeff Moden (2/25/2012)
Even though the following query uses ISNULL in the WHERE clause, the ISNULLs do not contain any information from a table and are, therefore, SARGable. Run the following with the "Actual Execution Plan" enabled and you'll see the Index Seek.
Yes, it's a seek, however...
EstimateRows="90000", ActualRows="128"
There is no way (without a recompile) that the optimiser is going to be able to get anywhere close to a good estimate, and without a good estimate you're prone to all sorts of 'bad' plans, bad because they are generated with an estimated row count that is nowhere close to reality.
A difference of almost 3 orders of magnitude on the row estimations is not something you want to see and is usually a strong indication that there's some problem with the query.
This is, in fact, one of the core problems of catch-all queries, that the row estimations that SQL generates based on one set of parameters is completely incorrect for a different set of parameters, leaving the query with no single optimal plan. That's why I generally recommend dynamic SQL or OPTION (RECOMPILE) (on SQL 2008 SP2 and above).
I wouldn't personally recommend this unless the query really is as simple as has been discussed here, a single table and a single predicate. More than that and the cost-based errors will just start to multiply and the optimiser will be adding in all sorts of operators designed for larger row counts (potentially including parallelism) that are just completely sub-optimal on smaller row counts.
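For completeness, the parameterised dynamic SQL approach described in that blog post looks something like this (a sketch, reusing the test table names from earlier in the thread): only the predicates for non-NULL parameters are concatenated into the statement, and the values themselves are always passed as parameters to sp_executesql, which keeps the query safe from SQL injection and gives each parameter combination its own cached plan.

```sql
-- Build only the predicates that are actually needed for this call
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT SomeDateTime, SomeID FROM dbo.JBMTest WHERE 1 = 1';

IF @FromDate IS NOT NULL
    SET @sql = @sql + N' AND SomeDateTime >= @FromDate';
IF @ThroughDate IS NOT NULL
    SET @sql = @sql + N' AND SomeDateTime <= @ThroughDate';

-- Values travel as real parameters, never concatenated into the string
EXEC sys.sp_executesql
     @sql,
     N'@FromDate DATETIME, @ThroughDate DATETIME',
     @FromDate, @ThroughDate;
```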
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 26, 2012 at 7:41 am
BSavoie (2/25/2012)
Actually I have many more search criteria. All are strings. Can I still take the simple approach or does that change things? Thanks as always!
If that's the case, I definitely recommend the dynamic SQL methods that Gail has in her fine blog post. Done correctly, as she demonstrated in her blog, it will be bullet-proof against SQL Injection and fast.
--Jeff Moden
Change is inevitable... Change for the better is not.