July 24, 2017 at 10:46 pm
Comments posted to this topic are about the item The Current Date
July 24, 2017 at 10:47 pm
Nice one, thanks Steve
Learned something new
July 24, 2017 at 10:48 pm
Good question, thanks Steve
____________________________________________
Space, the final frontier? not any more...
All limits henceforth are self-imposed.
"libera tute vulgaris ex"
July 24, 2017 at 11:48 pm
Nice easy one, thanks!
July 25, 2017 at 7:20 am
For most of my use cases, DATETIME2 seems to make the most sense: the increased accuracy and the smaller storage footprint are both wins.
I was enlightened by the DATETIME vs. DATETIME2 debate when it comes to manipulating and comparing date/time values. The number of conversions from DATETIME2 to DATETIME surprised me.
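A quick side-by-side makes the difference visible (a sketch, assuming SQL Server 2008 or later; the column aliases are mine):

-- DATETIME rounds to .000/.003/.007 seconds; DATETIME2 keeps up to 7 decimal places
SELECT GETDATE()     AS OldDateTime,   -- e.g. 2017-07-25 07:20:13.147
       SYSDATETIME() AS NewDateTime2;  -- e.g. 2017-07-25 07:20:13.1472108
-- And DATETIME2(3) stores millisecond precision in 7 bytes vs. DATETIME's fixed 8
SELECT DATALENGTH(CAST(GETDATE() AS datetime))         AS DateTimeBytes,   -- 8
       DATALENGTH(CAST(SYSDATETIME() AS datetime2(3))) AS DateTime2Bytes;  -- 7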
July 25, 2017 at 9:22 am
Kaye Cahs - Tuesday, July 25, 2017 7:20 AM
For most of my use cases, DATETIME2 seems to make the most sense: the increased accuracy and the smaller storage footprint are both wins.
I was enlightened by the DATETIME vs. DATETIME2 debate when it comes to manipulating and comparing date/time values. The number of conversions from DATETIME2 to DATETIME surprised me.
You save only one byte to get 3 digits of precision. Yes, it's precise to the millisecond rather than to 3.3 milliseconds, but my question then becomes: what are you doing that requires more precision than the nearest 1/300th of a second?
For that 1 byte of savings, you give up direct date math for things like duration calculations between a start and an end date, and you certainly lose the ability to easily sum such durations because of the silly restrictions MS put on the DATETIME2 datatype. For example...
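A three-line sketch is enough to trigger the restriction (the variable name is mine):

DECLARE @StartDate datetime2 = SYSDATETIME();
-- With DATETIME this cast yields days since 1900-01-01 as a FLOAT; with DATETIME2 it's refused:
SELECT CAST(@StartDate AS float) AS DaysSince1900;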
Msg 529, Level 16, State 2, Line 3
Explicit conversion from data type datetime2 to float is not allowed.
That also flies in the face of the useful ANSI standard behavior, which states that EndDate - StartDate = Interval. Even bloody Excel can do that much.
Don't think for a minute that MS doesn't know about the problem they've created. Rather than fix the root problem, they came up with DATEDIFF_BIG(), which still has limits that can be overwhelmed when working at higher precision.
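To see those limits, try counting nanoseconds (a sketch, assuming SQL Server 2016 or later for DATEDIFF_BIG):

-- A month is ~2.7 quadrillion nanoseconds: far past the INT that DATEDIFF returns
SELECT DATEDIFF(ns, '2017-01-01', '2017-02-01');      -- Msg 535: overflow
SELECT DATEDIFF_BIG(ns, '2017-01-01', '2017-02-01');  -- works: 2678400000000000
-- But roughly 292 years of nanoseconds exceeds BIGINT, so even DATEDIFF_BIG gives up
SELECT DATEDIFF_BIG(ns, '1753-01-01', '9999-12-31');  -- overflow again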
--Jeff Moden
Change is inevitable... Change for the better is not.
July 25, 2017 at 9:24 am
Heh... I can only imagine where such a question may have come from. :-)
--Jeff Moden
Change is inevitable... Change for the better is not.
July 25, 2017 at 12:52 pm
At the PASS summit, they should hand out SQL Server DBA branded digital watches displaying the current datetime2 formatted as an ISO 8601 string.
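(Style 126 would do nicely for the watch face; a sketch assuming a datetime2 source:)

SELECT CONVERT(varchar(27), SYSDATETIME(), 126) AS WatchFace;
-- e.g. 2017-07-25T12:52:07.7654321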
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
July 26, 2017 at 1:52 am
nice question
Manik
You cannot get to the top by sitting on your bottom.
August 1, 2017 at 5:23 am
I didn't know it had changed, so I learned something new. Thanks, Steve.
August 1, 2017 at 8:48 am
Crap, I chose the wrong one ((((
August 7, 2017 at 1:14 am
I see comments in this post trying to justify the old legacy DATETIME data type. It's just wrong! Why would you put up with its vagueness when it comes to precision?
August 8, 2017 at 11:50 am
David Conn - Monday, August 7, 2017 1:14 AM
I see comments in this post trying to justify the old legacy DATETIME data type. It's just wrong! Why would you put up with its vagueness when it comes to precision?
Because most people don't even need precision to the second, never mind precision to 100 nanoseconds. Another reason is that the DATETIME datatype is more useful because it allows direct date math (even Excel allows that), whereas the "new" datatypes do not. Saving a byte or two or three to give that up is, as you say, just wrong.
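For illustration, here's the kind of direct math DATETIME allows (a sketch; the dates are made up):

DECLARE @StartDT datetime = '2017-08-08 08:00:00',
        @EndDT   datetime = '2017-08-08 17:30:00';
-- Subtraction just works; the result is an offset from 1900-01-01
SELECT @EndDT - @StartDT AS Duration;                          -- 1900-01-01 09:30:00.000
-- Casting to FLOAT turns that into days, which you can scale and SUM
SELECT CAST(@EndDT - @StartDT AS float) * 24 AS DurationHours; -- 9.5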
--Jeff Moden
Change is inevitable... Change for the better is not.
August 10, 2017 at 5:38 pm
GETDATE() can also be much faster at scale. If you're just looking for days/hours/minutes/seconds, why not use datetime?
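One rough way to test the speed claim for yourself (a sketch only; timings vary by machine and build):

DECLARE @i int = 0, @d datetime, @d2 datetime2,
        @clock datetime2 = SYSDATETIME();
-- A million calls to GETDATE()...
WHILE @i < 1000000 SELECT @d = GETDATE(), @i += 1;
SELECT DATEDIFF(ms, @clock, SYSDATETIME()) AS GetDateMs;
-- ...versus a million calls to SYSDATETIME()
SELECT @i = 0, @clock = SYSDATETIME();
WHILE @i < 1000000 SELECT @d2 = SYSDATETIME(), @i += 1;
SELECT DATEDIFF(ms, @clock, SYSDATETIME()) AS SysDateTimeMs;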