April 28, 2009 at 8:16 am
Kurt W. Zimmerman (4/28/2009)
My approach is quite simple. Oftentimes I am passing a date range into a sproc for selection criteria. What I do is the following:
set @SubmitDateFrom = cast(convert(varchar(20), @SubmitDateFrom, 101) + ' 00:00:00' as datetime)
set @SubmitDateTo = cast(convert(varchar(20), @SubmitDateTo, 101) + ' 23:59:59' as datetime)
Then I go ahead and do a simple comparison in my WHERE clause:
WHERE p.SubmitDate BETWEEN @SubmitDateFrom AND @SubmitDateTo
I've established this as a standard development practice and it seems to work just fine.
Kurt
This will work if p.SubmitDate is declared as a SMALLDATETIME, but if it is declared as a DATETIME you will miss any records with a time portion greater than 23:59:59 and before the following midnight (e.g. 23:59:59.003 through 23:59:59.997). It really isn't the best way to filter a date range.
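For illustration (a sketch, not from the original post), a DATETIME row stamped a few milliseconds before midnight slips through the BETWEEN filter, while an open-ended upper bound catches it:

```sql
-- Sketch: DATETIME has 3 ms granularity, so times after 23:59:59.000 exist
DECLARE @SubmitDate datetime
SET @SubmitDate = '2009-04-28 23:59:59.997'   -- still April 28th

-- BETWEEN ... '2009-04-28 23:59:59' misses this row; the open-ended form does not
SELECT CASE WHEN @SubmitDate >= '2009-04-28'
             AND @SubmitDate <  DATEADD(dd, 1, '2009-04-28')
            THEN 'included' ELSE 'missed' END
```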
April 28, 2009 at 8:20 am
Kurt W. Zimmerman (4/28/2009)
My approach is quite simple. Oftentimes I am passing a date range into a sproc for selection criteria. What I do is the following:
set @SubmitDateFrom = cast(convert(varchar(20), @SubmitDateFrom, 101) + ' 00:00:00' as datetime)
set @SubmitDateTo = cast(convert(varchar(20), @SubmitDateTo, 101) + ' 23:59:59' as datetime)
Then I go ahead and do a simple comparison in my WHERE clause:
WHERE p.SubmitDate BETWEEN @SubmitDateFrom AND @SubmitDateTo
I've established this as a standard development practice and it seems to work just fine.
Kurt
This can lead to incorrect values in SQL 2008 if the more precise datetime data types are used.
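To make that concrete (an illustrative sketch assuming SQL Server 2008's datetime2 type): the more precise type carries up to seven fractional digits, so an upper bound of 23:59:59 misses almost a full second of every day:

```sql
-- Sketch, assumes SQL Server 2008+ for datetime2
DECLARE @d datetime2 = '2009-04-28 23:59:59.9999999'

SELECT CASE WHEN @d BETWEEN '2009-04-28 00:00:00' AND '2009-04-28 23:59:59'
            THEN 'included' ELSE 'missed' END   -- 'missed', despite being April 28th
```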
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
April 28, 2009 at 8:23 am
TheSQLGuru (4/28/2009)
This can lead to incorrect values in SQL 2008 if the more precise datetime data types are used.
Not just 2008 - as Lynn has correctly observed!
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
April 28, 2009 at 8:43 am
I would note that, in working with an existing legacy database in SQL Server 2000, where most datetime fields are _intended_ to be dates only (but sometimes have erroneous non-midnight time components), the least inefficient means we have found so far to guarantee accurate date-only comparisons on existing fields is:
DATEADD(dd,DATEDIFF(d,0,@DateTime),0)
-- replace @DateTime with the datetime or smalldatetime field of choice.
-- Note that putting this in a function and then calling the function in your SQL is dramatically slower than hardcoding the statement in your SQL.
-- Note that the same thing can be done with any uses of GetDate() - drop it into a date-only variable at the top of your stored procedure.
This matches the SQL in the Wild blog post mentioned earlier.
I would also note that for read-only lookup tables (bulk reporting additions to an OLTP database), we have added datetime fields to the end of the original tables with a CHECK constraint so that the new fields are midnight-guaranteed, and then indexed those new fields (yes, backwards compatibility with the original database table was important):
ALTER TABLE database.owner.NewTable
ADD CONSTRAINT CK_startdateDateOnly_NewTable
CHECK (startdateDateOnly = DATEADD(dd,DATEDIFF(d,0,startdate),0)) -- need DateOnly field because a time in the field will screw up "date based" compares (like GetDate()), since a "date" includes a time of midnight
No, I am not able to change the design of the legacy database.
April 28, 2009 at 8:50 am
Generally, when I'm using a stored procedure or a script that needs to retrieve data based on dates, I will use something like this at the top of the code, to ensure that the dates are consistent throughout:
DECLARE @Start_Date DATETIME
DECLARE @End_Date DATETIME
SET @Start_Date = '01/01/2009'
SET @End_Date = '03/31/2009'
SET @End_Date = DateAdd(dd, 1, @End_Date)
Sometimes the date variables are set by code in a procedure, sometimes by hand for manual queries, but this allows me to put in the logical date asked for by a user (all enrollments to a program in the first quarter of 2009, for example) without worrying about what the next day would be. I then use:
WHERE TableDate >= @Start_Date
AND TableDate < @End_Date
This should also allow SQL to determine what the execution plan will look like without having a function involved in the WHERE or ON clauses.
Just my little bit of input... Have a great day!
Dena Brown
April 28, 2009 at 12:00 pm
I like this example for the additional reason that it is explicit. Using techniques like BETWEEN '2009-01-01' AND '2009-01-02' to get dates that fall on the first hides the real intent, since the filter names 01-02 while meaning only 01-01.
In other cases I also like to use datepart() and call out the year, month, and day:
where datepart(yyyy, comparedate) = datepart(yyyy, referencedate)
and datepart(month, comparedate) = datepart(month, referencedate)
and datepart(day, comparedate) = datepart(day, referencedate)
If I have control of the schema and I know that I want to compare on dates a lot, I will put in a derived integer column that can be indexed; this will really speed things up.
columnname as cast(floor(cast(referencedate as float)) as int)
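A fuller sketch of that derived column (hypothetical table and index names; assumes SQL Server 2005+ so the computed column can be marked PERSISTED and indexed):

```sql
CREATE TABLE dbo.Orders (
    OrderID       int IDENTITY(1,1) PRIMARY KEY,
    referencedate datetime NOT NULL,
    -- whole-day number derived from the datetime; PERSISTED so it can be indexed
    refdateint    AS CAST(FLOOR(CAST(referencedate AS float)) AS int) PERSISTED
)

CREATE INDEX IX_Orders_refdateint ON dbo.Orders (refdateint)
```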
April 28, 2009 at 12:24 pm
The easiest way to extract date ranges without having to worry about the time parts is to subtract 1 day from the bottom of the range and add 1 day to the top of the range.
example:
if you want dates between 1/1/2009 and 1/31/2009
set query to test for greater than 12/31/2008 and less than 2/1/2009
April 28, 2009 at 12:44 pm
Josie (4/28/2009)
the easiest way to extract date ranges without having to worry about the time parts is to subtract 1 day from the bottom of the range and add 1 day to the top of the range
example:
if you want dates between 1/1/2009 and 1/31/2009
set query to test for greater than 12/31/2008 and less than 2/1/2009
This is absolutely NOT correct!
12/31/2008 is actually '12/31/2008 00:00:00' in SQL Server. Thus you will INCLUDE ALL OF THAT DAY (except 00:00:00 exactly) in the result set with a filter of:
> '12/31/2008'
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
April 28, 2009 at 12:52 pm
spatley (4/28/2009)
I like this example for the additional reason that it is explicit. Using techniques like BETWEEN '2009-01-01' AND '2009-01-02' to get dates that fall on the first hides the real intent, since the filter names 01-02 while meaning only 01-01.
In other cases I also like to use datepart() and call out the year, month, and day:
where datepart(yyyy, comparedate) = datepart(yyyy, referencedate)
and datepart(month, comparedate) = datepart(month, referencedate)
and datepart(day, comparedate) = datepart(day, referencedate)
If I have control of the schema and I know that I want to compare on dates a lot, I will put in a derived integer column that can be indexed; this will really speed things up.
columnname as cast(floor(cast(referencedate as float)) as int)
Based on the above, I am having a hard time knowing which is the date you want and which is the date you are comparing, so for the sake of argument, referencedate will be the value in the database and comparedate is the date I want to query for in the where clause. Here is the problem: your where clause above will not allow you to use an index, because the column is wrapped in functions. This is how I would write the where clause:
where
referencedate >= dateadd(dd, datediff(dd, 0, @comparedate), 0) and -- Beginning of this day
referencedate < dateadd(dd, datediff(dd, 0, @comparedate) + 1, 0) -- Beginning of next day
This will allow the QO to use the index on referencedate, if it exists.
April 28, 2009 at 1:52 pm
Here's another method for removing the time from a datetime value I picked up from a colleague.
select dateadd(day,0,datediff(day,0,getdate()))
April 28, 2009 at 3:26 pm
craig.lovegren (4/28/2009)
guess we could always just run both methods over a largish table and get stats out of the query analyser (like cpu usage, etc)
I did exactly that, using the small example, you find that the datediff and dateadd are nearly identical, indexes or not. So I grabbed a large table (a few million rows) from our systems, indexed them appropriately and ran a few tests. When comparing my three tests, datediff ranked at 90%, dateadd, although a complex plan, took only 9%, and the final test took the remaining 1%. What we typically do for tables which are queried on a datetime field religiously is to actually create another column with the date portion only. It does require an extra 4 bytes (smalldatetime) per row, but disk space is cheap. Just remember to update the column after the data load in a batch update statement, don't use triggers, computed columns, or calcs in your inserts, as this will slow your data loads down drastically.
Percent of what? Percent of BATCH??? That absolutely cannot be trusted. Only actual cpu usage, duration, reads, and writes should be used to measure performance.
--Jeff Moden
Change is inevitable... Change for the better is not.
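For reference, the extra date-only column craig describes in the quoted post might be set up like this (a sketch with hypothetical names; the batch UPDATE runs once after each load, per his advice to avoid triggers, computed columns, and per-row calcs):

```sql
ALTER TABLE dbo.BigTable ADD LoadDateOnly smalldatetime NULL

-- one batch update after the load, not per-row work during the insert
UPDATE dbo.BigTable
SET    LoadDateOnly = DATEADD(dd, DATEDIFF(dd, 0, LoadDate), 0)
WHERE  LoadDateOnly IS NULL

CREATE INDEX IX_BigTable_LoadDateOnly ON dbo.BigTable (LoadDateOnly)
```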
April 28, 2009 at 5:03 pm
Not sure why dates cause so much trouble...
declare
@date datetime,
@today datetime,
@tonight datetime
select
@date = getdate(),
@today = dateadd(dd,0,datediff(dd,0,@date)),
@tonight = dateadd(ms,-3,@today+1) -- 0.003 is the smallest unit for datetime
select
@date,
@today,
@tonight
now rather than doing the '#temp_table' thing...(bah!)...
add a computed column using these calculations; the columns are deterministic, so they can be indexed.
eg.
Create Table DateTable (
inDate datetime,
inDay as dateadd(dd,0,datediff(dd,0,inDate)),
endDay as dateadd(ms,-3,dateadd(dd,0,datediff(dd,0,inDate))+1)
)
--endDay is messy as you can't reference another computed value.
now simply select on all the 'inDay' values you want...
or 'inDate between inDay and endDay'
So everyone should be able to cope with @today & @tonight - the same functionality can be extended to cater for any type of time period, most commonly 'last month':
@start = dateadd(mm,-1,dateadd(dd,-1*(datepart(dd,@today)-1),@today))
@finish = dateadd(ms,-3,dateadd(mm,1,@start))
select * from DateTable where inDate between @start and @finish
Avoid using 'strings' as dates; too many things can go wrong...
April 28, 2009 at 5:09 pm
rob.lobbe (4/28/2009)
Not sure why dates cause so much trouble...declare
@date datetime,
@today datetime,
@tonight datetime
select
@date = getdate(),
@today = convert(datetime,convert(int,@date)),--use smallint for smalldatetime
@tonight = dateadd(ms,-3,@today+1) -- 0.003 is the smallest unit for datetime
select
@date,
@today,
@tonight
Did you check your output?
@date = '2009-04-28 16:06:36.970',
@today = '2009-04-29', -- oops, it's not tomorrow quite yet
@tonight = '2009-04-29 23:59:59.997' -- again, in the future
After noon, casting to an INT will round to the following day.
April 28, 2009 at 5:21 pm
craig.lovegren (4/28/2009)
rob.lobbe (4/28/2009)
Not sure why dates cause so much trouble...declare
@date datetime,
@today datetime,
@tonight datetime
select
@date = getdate(),
@today = dateadd(dd,0,datediff(dd,0,@date)),
@tonight = dateadd(ms,-3,@today+1) -- 0.003 is the smallest unit for datetime
select
@date,
@today,
@tonight
Did you check your output?
@date = '2009-04-28 16:06:36.970',
@today = '2009-04-29', -- oops, it's not tomorrow quite yet
@tonight = '2009-04-29 23:59:59.997' -- again, in the future
After noon, casting to an INT will round to the following day.
Still morning where I am - missed it - Ta!
(and in the glorious tradition of community sharing I pinched a quick fix.)
April 28, 2009 at 5:27 pm
Jeff Moden (4/28/2009)
Percent of what? Percent of BATCH??? That absolutely cannot be trusted. Only actual cpu usage, duration, reads, and writes should be used to measure performance.
Actually, it was an overall grade. Pseudo-code:
build a table containing test #, iteration #, and precomputed random date inside valid range, no time values
define a local, static, forward only, read only cursor to extract the dates one by one (I know, cursors, ick)
dbcc dropcleanbuffers
dbcc freeproccache
set statistics io on
set statistics time on
execute test:
#1: DATEDIFF( d, ((column)), ((testdate)) ) = 0
#2: ((column)) >= ((testdate)) AND ((column)) < DATEADD( d, 1, ((testdate)) )
#3: ((stored truncated date column)) = ((testdate))
set statistics io off
set statistics time off
loop to next iteration
The results were taken from the Output window, run through Perl, and shoved in to a table.
Oddly, the results I am getting now are different than this morning; I can only assume I didn't rebuild the indexes the same way I did this morning. Test #2 and #3 are showing almost identical results... hmmm.
Test 1: 5889 cpu, 17 scans, 78,030 logical reads
Test 2: 27 cpu, 1 scan, 696 logical reads
Test 3: 23 cpu, 1 scan, 672 logical reads