January 20, 2014 at 6:14 pm
[h4]DateDiff tricks and the missing 147 years[/h4]
Data-type DateTime has the following properties:
It can represent any valid date/time between 1753-01-01 00:00:00.000 and 9999-12-31 23:59:59.997.
[Code]
Declare
    @st DateTime,
    @et DateTime;
Select
    @st = '1753-01-01 00:00:00.000',
    @et = '9999-12-31 23:59:59.998';
[/code]
Mess with the dates a bit and you will see that these two are the upper and lower boundaries.
Nothing new here, BOL tells us that.
It is internally represented by two 32-bit integers.
[Code]
Declare
    @st DateTime,
    @et DateTime;
Select
    @st = '1753-01-01 00:00:00.000',
    @et = '9999-12-31 23:59:59.997';
Select
    Cast(@st as VarBinary) [@st as Hex],
    Cast(@et as VarBinary) [@et as Hex];
Select
    SubString(Cast(@st as VarBinary),1,4) [@st 1st Int as Hex],
    SubString(Cast(@st as VarBinary),5,4) [@st 2nd Int as Hex],
    SubString(Cast(@et as VarBinary),1,4) [@et 1st Int as Hex],
    SubString(Cast(@et as VarBinary),5,4) [@et 2nd Int as Hex];
Select
    Cast(SubString(Cast(@st as VarBinary),1,4) as Int) [@st 1st Int as Int],
    Cast(SubString(Cast(@st as VarBinary),5,4) as Int) [@st 2nd Int as Int],
    Cast(SubString(Cast(@et as VarBinary),1,4) as Int) [@et 1st Int as Int],
    Cast(SubString(Cast(@et as VarBinary),5,4) as Int) [@et 2nd Int as Int];
[/code]
Oddly, the lowest date/time boundary (1753-01-01 00:00:00.000) is not internally coded as two zeros; it is -53690 and 0. Also, notice that the first integer of the highest date/time is 2958463. In passing, the second integer really has no bearing on my subject, but I'll tell you that it stores the number of 1/300th-of-a-second (roughly 3.33 millisecond) ticks since midnight for the given day.
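A quick way to check that "ticks since midnight" claim (just a sketch; one second past midnight should give 300, since there are 300 ticks per second):
[Code]
Declare @t DateTime = '1900-01-01 00:00:01.000';
Select Cast(SubString(Cast(@t as VarBinary(8)),5,4) as Int) [Ticks since midnight]; -- returns 300
[/code]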
Now run this:
[Code]
Select
    DateAdd(dd, 53690, '1753-01-01 00:00:00.000');
[/code]
And we get 1900-01-01 00:00:00.000. This is our reference point, around which the first of the two internal integers is composed; it represents the arithmetic difference in days between 1900-01-01 00:00:00.000 and your other date/time value. This is why, internally, it can be negative or positive. This is also the source of another odd T-SQL quirk (like this is the first one :w00t:).
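Before we get to that quirk, here are those day offsets, negative and positive, pulled straight from DateDiff (just a sketch using the boundary dates from above):
[Code]
Select
    DateDiff(dd, '1900-01-01', '1753-01-01') [Lowest day number],   -- -53690
    DateDiff(dd, '1900-01-01', '9999-12-31') [Highest day number];  -- 2958463
[/code]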
[Code]
Declare
@st DateTime,
@et DateTime;
Select
@st='1753-01-01 00:00:00.000',
@et='9999-12-31 23:59:59.998';
Select
DateDiff(yy,@st,@et);
[/code]
You get back 8246. No surprises here; this is just the number of years between our valid lowest and highest dates. Now run:
[Code]
Declare
    @st DateTime,
    @et DateTime;
Select
    @st = '1753-01-01 00:00:00.000',
    @et = '9999-12-31 23:59:59.998';
Select
    DateDiff(yy, 0, @et - @st);
[/code]
Huh :crazy:, the dreaded "Arithmetic overflow error converting expression to data type datetime". But if you run:
[Code]
Declare
    @st DateTime,
    @et DateTime;
Select
    @st = '1900-12-31 00:00:00.000',
    @et = '9999-12-31 23:59:59.998';
Select
    DateDiff(yy, 0, @et - @st);
[/code]
Everything is OK; you get 8099, the number of years between those two date/time values.
So what happened here, that we apparently lost 147 years of range (1900 - 1753) by using the DateDiff function in this manner? Well, the DateDiff function wants a DateTime value as its second and third parameters so that it can do the difference calculation with knowledge of the internal days representation. When you pass 0 as the second parameter, you are thwarting this process by calculating the difference yourself and possibly getting caught by the zero-crossing problem!
Example:
1753-01-01 is internally represented by a days number of -53690 (the smallest valid days number)
1900-01-01 is internally represented by a days number of 0
1900-12-31 is internally represented by a days number of 364
9999-12-31 is internally represented by a days number of 2958463 (the largest valid days number)
Ex.1) If you take the arithmetic difference (using the binary minus operator) between 9999-12-31 and 1900-12-31 as days, you'd get 2958099
Ex.2) If you take the arithmetic difference (using the binary minus operator) between 9999-12-31 and 1753-01-01 as days, you'd get 3012153
Now, I'm going to tell you that this problem actually has nothing to do directly with the DateDiff function! When you take the arithmetic difference of two DateTime values, the result is implicitly cast back into a DateTime value. Therefore, Ex.1 above has no problem being passed into DateDiff because it casts back into a valid DateTime. However, Ex.2 runs into the "Arithmetic overflow error converting expression to data type datetime" because 3012153 is larger than the highest internal days number!
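To see that it really is this implicit cast that blows up, and not DateDiff itself, you can cast the raw day numbers straight to DateTime (just a sketch):
[Code]
Select Cast(2958099 as DateTime);  -- Ex.1: fine, well within the valid day range
Select Cast(3012153 as DateTime);  -- Ex.2: arithmetic overflow, larger than 2958463
[/code]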
So, we can see that DateDiff knows how to handle the difference calculation when one of its DateTime values has a negative internal representation, where an arithmetic difference doesn't! So, should we avoid using DateDiff in this manner? Of course not! It solves some rounding problems for us when we do! Just use it with an awareness of what is going on under the covers!
Your 147 years have been returned!
April 8, 2015 at 11:05 am
Thanks, Jeff, very useful! I ran into the datetime2 issue just yesterday while trying what turned out to be an unnecessary modification to my audit system. It's kind of odd: I don't recall doing a lot of DateDiff work over the years until this job, where we need awareness of expired eye exams and such.
-----
[font="Arial"]Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson[/font]
April 9, 2015 at 4:55 pm
Thanks everyone for all the good comments and ideas.
April 15, 2015 at 9:04 am
The (very small) fly in the ointment is that the 114 date style code in the CONVERT function returns milliseconds preceded by a colon rather than a period (decimal point).
That's always looked odd to my eyes.
Microsoft's definition of valid string literals for the time datatype includes the following:
Milliseconds can be preceded by either a colon (:) or a period (.). If a colon is used, the number means thousandths-of-a-second. If a period is used, a single digit means tenths-of-a-second, two digits mean hundredths-of-a-second, and three digits mean thousandths-of-a-second. For example, 12:30:20:1 indicates 20 and one-thousandth seconds past 12:30; 12:30:20.1 indicates 20 and one-tenth seconds past 12:30.
BOL gives the 114 format as hh:mi:ss:mmm, and indeed three digits are always returned for milliseconds, with leading zeros if needed.
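A quick illustration (just a sketch; style 121 is shown only for contrast, since it uses the period):
[Code]
Select
    Convert(VarChar(30), GetDate(), 114) [Style 114],  -- hh:mi:ss:mmm
    Convert(VarChar(30), GetDate(), 121) [Style 121];  -- yyyy-mm-dd hh:mi:ss.mmm
[/code]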
That effectively makes the colon and period interchangeable, so I struggle to see the point of the 113 and 114 codes.
Do they relate to any ANSI standard?
April 16, 2015 at 4:11 pm
Gary Harding (4/15/2015)
The (very small) fly in the ointment is that the 114 date style code in the CONVERT function returns milliseconds preceded by a colon rather than a period (decimal point). That's always looked odd to my eyes.
Microsoft's definition of valid string literals for the time datatype includes the following:
Milliseconds can be preceded by either a colon (:) or a period (.). If a colon is used, the number means thousandths-of-a-second. If a period is used, a single digit means tenths-of-a-second, two digits mean hundredths-of-a-second, and three digits mean thousandths-of-a-second. For example, 12:30:20:1 indicates 20 and one-thousandth seconds past 12:30; 12:30:20.1 indicates 20 and one-tenth seconds past 12:30.
BOL gives the 114 format as hh:mi:ss:mmm, and indeed three digits are always returned for milliseconds, with leading zeros if needed.
That effectively makes the colon and period interchangeable, so I struggle to see the point of the 113 and 114 codes.
Do they relate to any ANSI standard?
My greatest disappointment is that MS doesn't have a datetime conversion for Pig Latin. :-P
--Jeff Moden
Change is inevitable... Change for the better is not.
April 16, 2015 at 5:45 pm
Jeff Moden (4/16/2015)
My greatest disappointment is that MS doesn't have a datetime conversion for Pig Latin. :-P
Easy enough.
[Code]
DECLARE @MyDate DATETIME = '2014-01-01';
SELECT UPPER(SUBSTRING(ds,2,1)) + SUBSTRING(ds,3,1) + LOWER(LEFT(ds,1)) + 'ay' + SUBSTRING(ds, 4, 99)
FROM (SELECT CAST(@MyDate AS VARCHAR(30))) a (ds);
[/code]
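For what it's worth, against '2014-01-01' that should return something like 'Anjay  1 2014 12:00AM'.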
My thought question: Have you ever been told that your query runs too fast?
My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.
Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
April 16, 2015 at 5:57 pm
Ummm.... are you sure that's ANSI Pig Latin??? :-P
--Jeff Moden
Change is inevitable... Change for the better is not.
April 16, 2015 at 6:49 pm
Jeff Moden (4/16/2015)
Ummm.... are you sure that's ANSI Pig Latin??? :-P
Not sure. Where's Joe Celko when you need him?
My thought question: Have you ever been told that your query runs too fast?
My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.
Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
April 17, 2015 at 8:06 am
Jeff Moden (4/16/2015)
My greatest disappointment is that MS doesn't have a datetime conversion for Pig Latin. :-P
It's in Alpha testing right now and will release next year on April 1. :-P
-----
[font="Arial"]Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson[/font]
April 17, 2015 at 9:02 am
Wayne West (4/17/2015)
Jeff Moden (4/16/2015)
My greatest disappointment is that MS doesn't have a datetime conversion for Pig Latin. :-P
It's in Alpha testing right now and will release next year on April 1. :-P
Maybe then everyone will have an Equal Say :-P
April 19, 2015 at 11:51 am
Gary Harding (4/17/2015)
Wayne West (4/17/2015)
Jeff Moden (4/16/2015)
My greatest disappointment is that MS doesn't have a datetime conversion for Pig Latin. :-P
It's in Alpha testing right now and will release next year on April 1. :-P
Maybe then everyone will have an Equal Say :-P
But only after a couple of Service Packs so they can get it right. :-P Remember SUM() OVER and MERGE. :-P
--Jeff Moden
Change is inevitable... Change for the better is not.
December 3, 2015 at 6:01 am
Occasionally there's a requirement for elapsed time as years, months, days, hours, minutes, seconds and milliseconds, such that if you were to add all of these time fractions to the original start datetime using DATEADD, you'd arrive at the end datetime.
Here it is:
[Code]
CREATE FUNCTION [dbo].[iTVF_ElapsedTime]
/*
Calculate the elapsed time between two datetimes
as year, month, day, hour, minute, second, millisecond
such that adding these values using DATEADD to the earlier
value will yield the later value.
*/
(@Then DATETIME, @Now DATETIME)
RETURNS TABLE WITH SCHEMABINDING AS
RETURN
SELECT
y.Years,
mo.Months,
r.[Days],
r.[Hours],
r.[Minutes],
r.Seconds,
r.Milliseconds
FROM (
SELECT
DATESTRING = CONVERT(VARCHAR(8),@Then,112)+' '+CONVERT(VARCHAR(12),@Then,14),
PARMSTRING = CONVERT(VARCHAR(8),@Now,112)+' '+CONVERT(VARCHAR(12),@Now,14)
) ds
CROSS APPLY (SELECT [Years] = DATEDIFF(YEAR,@Then,@Now) - CASE WHEN SUBSTRING(DATESTRING,5,17) > SUBSTRING(PARMSTRING,5,17) THEN 1 ELSE 0 END) y
CROSS APPLY (SELECT [YearAdjDate] = DATEADD(YEAR,y.[Years],@Then)) y4
CROSS APPLY (SELECT [Months] = DATEDIFF(MONTH,y4.YearAdjDate,@Now) - CASE WHEN SUBSTRING(DATESTRING,7,15) > SUBSTRING(PARMSTRING,7,15) THEN 1 ELSE 0 END) mo
CROSS APPLY (
SELECT
[Days] = DATEDIFF(DAY,DATEADD(MONTH,mo.[Months],y4.YearAdjDate),@Now) - CASE WHEN SUBSTRING(DATESTRING,9,13) > SUBSTRING(PARMSTRING,9,13) THEN 1 ELSE 0 END,
[Hours] = DATEPART(HOUR,@Now-@Then),
[Minutes] = DATEPART(MINUTE,@Now-@Then),
[Seconds] = DATEPART(SECOND,@Now-@Then),
[Milliseconds] = DATEDIFF(MILLISECOND,DATEADD(SECOND,(CASE WHEN DATEPART(MILLISECOND,@Then) > DATEPART(MILLISECOND,@Now) THEN -1 ELSE 0 END),DATEADD(MILLISECOND,DATEPART(MILLISECOND,@Then),DATEADD(MILLISECOND,0-DATEPART(MILLISECOND,@Now),@Now))),@Now)
) r
GO
[/code]
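A single standalone call, for anyone who wants to try it without the harness below (the two dates here are arbitrary):
[Code]
Select * From dbo.iTVF_ElapsedTime('2015-01-31 23:59:59.000', '2015-03-01 00:00:00.000');
[/code]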
and here's a simple test harness for it:
[Code]
-- Testing
-- 42,550 rows / 00:00:01 including generating sample data,
--reconstituting now from then and output to screen
DECLARE @Now DATETIME = GETDATE();
WITH SampleData AS (
SELECT TOP(211+DATEDIFF(DAY,'19000101',GETDATE()))
[Then] = DATEADD(MILLISECOND,ABS(CHECKSUM(NEWID()))%86400000,DATEADD(DAY, ROW_NUMBER() OVER(ORDER BY (SELECT NULL))-1,'19000101'))
FROM
(VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) d (n),
(VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) e (n),
(VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) f (n),
(VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) g (n),
(VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) h (n)
)
SELECT
s.[Then],
[Now] = @Now,
q.*,
c.Calc
FROM SampleData s
CROSS APPLY dbo.iTVF_ElapsedTime (s.[Then], @Now) q
-- include this for testing only
CROSS APPLY (
SELECT Calc =
DATEADD(MILLISECOND,q.[Milliseconds],
DATEADD(SECOND,q.[Seconds],
DATEADD(MINUTE,q.[Minutes],
DATEADD(HOUR,q.[Hours],
DATEADD(DAY,q.[days],
DATEADD(MONTH,q.months,
DATEADD(YEAR,q.Years,s.[Then]))))))) ) c
--WHERE @Now <> Calc
[/code]
For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
June 22, 2023 at 8:30 pm
They're also a delight to the scientific and other communities because the range of dates (although I believe it uses the wrong calendar prior to the year 1753) has been expanded to include as far back as January 1, 1 AD
That depends where in the world you live. The Gregorian calendar replaced the Julian calendar in Catholic areas in 1582. This required the "loss" of ten days for those affected. England had broken away from the Catholic Church by this time and did not adopt the calendar until September 1752, by which time it required the "loss" of eleven days. Russia did not adopt the calendar until after the Russian Revolution.