September 10, 2018 at 9:03 am
Jeff Moden - Monday, September 10, 2018 8:56 AM
Heh... now... about those nano-seconds. 😀
If I write one with DATEDIFF_BIG in it, someone might end up waiting a long time for the function to return 9,223,372,036,854,775,807 rows 😀😀
September 11, 2018 at 12:05 pm
Thom A - Monday, September 10, 2018 8:13 AM
I think part of what everyone's point is, is why did they introduce DATEDIFF_BIG instead of changing the return datatype of DATEDIFF on more recent versions of SQL Server? They could have simply listed the change in the breaking changes and then people would be aware to cater for it on more recent versions if it was an issue. I'm sure they've changed the return datatype of other functions before (none spring to mind right now though), so why not here?
There is tremendous opposition to breaking changes. Those cause large customers not to upgrade, which impacts sales. Therefore, no breaking changes.
We have to then deal with "2" objects, _BIG objects, etc.
September 11, 2018 at 9:30 pm
Steve Jones - SSC Editor - Tuesday, September 11, 2018 12:05 PM
Thom A - Monday, September 10, 2018 8:13 AM
I think part of what everyone's point is, is why did they introduce DATEDIFF_BIG instead of changing the return datatype of DATEDIFF on more recent versions of SQL Server? They could have simply listed the change in the breaking changes and then people would be aware to cater for it on more recent versions if it was an issue. I'm sure they've changed the return datatype of other functions before (none spring to mind right now though), so why not here?
There is tremendous opposition to breaking changes. Those cause large customers not to upgrade, which impacts sales. Therefore, no breaking changes.
We have to then deal with "2" objects, _BIG objects, etc.
Changing the newer temporal datatypes to have the same good functionality that the DATETIME datatype has would have broken nothing. And, I think if they introduced the DATEDIFF_BIG with the explanation of "and if you still want to do it the other way, here's an additional tool", it would have been labeled as marketing genius. They could still save face if they fixed the newer datatypes because it wouldn't break anything.
It would also be appropriate for them to fess up and say that dates prior to 1753 won't actually match what the calendar was before then, although I don't actually have a need for any dates prior to 1899.
--Jeff Moden
Change is inevitable... Change for the better is not.
September 13, 2018 at 12:47 am
Jeff Moden - Tuesday, September 11, 2018 9:30 PM
Changing the newer temporal datatypes to have the same good functionality that the DATETIME datatype has would have broken nothing. And, I think if they introduced the DATEDIFF_BIG with the explanation of "and if you still want to do it the other way, here's an additional tool", it would have been labeled as marketing genius. They could still save face if they fixed the newer datatypes because it wouldn't break anything.
It would also be appropriate for them to fess up and say that dates prior to 1753 won't actually match what the calendar was before then, although I don't actually have a need for any dates prior to 1899.
I just wish they'd sort the performance differential between the new and old types. I can live with DATEDIFF_BIG if I have to (although I don't really see why it was needed over just changing DATEDIFF), and I don't really like the ability to add an integer to a date using the + operator, regardless of what the standards might say - it's like adding 1 to an apple; they aren't the same thing. It really isn't obvious why the integer value should be days as opposed to hours, months, secs, whatever.
September 13, 2018 at 6:59 am
andycadley - Thursday, September 13, 2018 12:47 AM
I just wish they'd sort the performance differential between the new and old types. I can live with DATEDIFF_BIG if I have to (although I don't really see why it was needed over just changing DATEDIFF), and I don't really like the ability to add an integer to a date using the + operator, regardless of what the standards might say - it's like adding 1 to an apple; they aren't the same thing. It really isn't obvious why the integer value should be days as opposed to hours, months, secs, whatever.
It's pretty obvious to me that +1 means adding 1 day but that's not the big thing. Without using DATEDIFF_BIG, take two date/times that are more than 25 days apart and calculate the period between them down to the nearest 3.3 ms using DATETIME2 and then I'll show you how to do it using DATETIME and you'll see what I mean. An equally interesting fire drill is to add that period to the original start date.
I agree that it's a bit contrived because most people won't need to have such accuracy over such a long period but it does clearly demonstrate the problem I'm talking about.
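For anyone who wants to try the fire drill above, here's a minimal sketch (my own illustration, not Jeff's exact method; the variable names and sample dates are made up):

```sql
-- Two timestamps more than 25 days apart.
DECLARE @StartDT DATETIME = '2018-01-01 00:00:00.000',
        @EndDT   DATETIME = '2018-02-26 12:34:56.790';

-- DATETIME supports direct date math: the difference is itself a DATETIME
-- offset from the "zero date" of 1900-01-01, so one subtraction carries the
-- whole period (days and time) in a single value.
SELECT @EndDT - @StartDT AS DirectPeriod; -- 1900-02-26 12:34:56.790, i.e. 56 days 12:34:56.790

-- DATETIME2 has no "-" operator, and DATEDIFF(millisecond, ...) overflows an
-- INT once the span exceeds 2,147,483,647 ms (~24.9 days), hence DATEDIFF_BIG:
DECLARE @Start2 DATETIME2(3) = @StartDT,
        @End2   DATETIME2(3) = @EndDT;
SELECT DATEDIFF_BIG(millisecond, @Start2, @End2) AS PeriodMs;
```

Adding the period back to the start date is the second half of the drill: with DATETIME it's just `@StartDT + @Period`, while with DATETIME2 you're back to DATEADD gymnastics.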
--Jeff Moden
Change is inevitable... Change for the better is not.
September 14, 2018 at 8:04 am
Jonathan AC Roberts - Saturday, September 8, 2018 7:04 PM
It's not long in days but it is long in milliseconds. I did also provide the facility in that function to get all the years between two dates, which won't even take you back to when the earth was created.
Actually, for years, it's irrelevant. Any year prior to 1753 is effectively useless, as you'd have to decide whose year it was supposed to be in order for the value to have meaning as anything other than just another integer... Such is the nature of human behavior in the past... Where time keeping was concerned, we simply did not have the capability to track it accurately, and frequently allowed personal bias and politics to influence such decisions in different ways in different parts of the world. That "hot mess" simply cannot be sorted prior to January 1, 1753. And for all practical purposes, once you start going back to the age of the dinosaurs, what month of the year it happens to be probably wouldn't matter in the slightest... You'd have much more immediate survival issues, like the existence of velociraptors, and even establishing a viable food source that you don't have to compete with 2-ton animals with an attitude to get to.
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
September 14, 2018 at 8:56 am
Jeff Moden - Thursday, September 13, 2018 6:59 AM
It's pretty obvious to me that +1 means adding 1 day but that's not the big thing. Without using DATEDIFF_BIG, take two date/times that are more than 25 days apart and calculate the period between them down to the nearest 3.3 ms using DATETIME2, and then I'll show you how to do it using DATETIME and you'll see what I mean. An equally interesting fire drill is to add that period to the original start date.
I agree that it's a bit contrived because most people won't need to have such accuracy over such a long period but it does clearly demonstrate the problem I'm talking about.
Sure, but the contrived aspect of that is precisely why I say I can live with having to use DATEDIFF_BIG for longer time periods with very granular precision (and it is precision, not accuracy here).
The performance differential is harder to avoid and more frustrating, particularly when you have to deal with the Bank of England Base Rate going back before 1900, as I do. There is a period of history after the weirdness of 1753 that DateTime can't cope with.
September 14, 2018 at 8:56 am
sgmunson - Friday, September 14, 2018 8:04 AM
Jonathan AC Roberts - Saturday, September 8, 2018 7:04 PM
Jeff Moden - Saturday, September 8, 2018 3:49 PM
DATEDIFF_BIG is the bad joke that MS came up with because they royally screwed up when they created the relatively new DATETIME2, DATE, and TIME datatypes, which can no longer do direct date math such as Period = Enddatetime - Startdatetime, which the DATETIME datatype does quite nicely and IS both an ANSI and ISO standard for how dates, times, and periods are supposed to be handled.
Also, 2,147,483,647 milliseconds isn't actually a very long time at all. It's only 25 days + 20:31:23.647 (hh:mi:ss.mil).
It's not long in days but it is long in milliseconds. I did also provide the facility in that function to get all the years between two dates, which won't even take you back to when the earth was created.
Actually, for years, it's irrelevant. Any year prior to 1753 is effectively useless, as you'd have to decide whose year it was supposed to be in order for the value to have meaning as anything other than just another integer... Such is the nature of human behavior in the past... Where time keeping was concerned, we simply did not have the capability to track it accurately, and frequently allowed personal bias and politics to influence such decisions in different ways in different parts of the world. That "hot mess" simply cannot be sorted prior to January 1, 1753. And for all practical purposes, once you start going back to the age of the dinosaurs, what month of the year it happens to be probably wouldn't matter in the slightest... You'd have much more immediate survival issues, like the existence of velociraptors, and even establishing a viable food source that you don't have to compete with 2-ton animals with an attitude to get to.
Agreed, but it's not irrelevant, because people that don't know about things like the history of the Gregorian Calendar, or even about the fact that different calendars existed, still try to use the YEAR datepart prior to 1753-01-01. Fortunately (I think/hope) people that DO have to work with ancient dates know better than to use SQL Server for anything that requires historically accurate dates depending on what calendar system was in use at the time.
The other reason why it's not irrelevant is because so many drank the MS purple Kool-Aid™ about the newer temporal datatypes being ANSI/ISO compliant and they're actually not, because they don't allow for direct period calculations and the like.
And who would think that an EOMONTH function would be useful but not create an FOMONTH (first of month) function? Who would think that creating an ISOWEEK datepart would be useful without including the year according to ANSI/ISO standards? And then there's the "sargability" thing for the DATE datatype where they almost made it right (still a bit of a performance issue) when trying to compare whole dates to dates with times? Heh... and how 'bout that nanosecond date part? I'm sure that tons of people use that, right? The people that do need such accuracy generally need more than the 100 nanosecond accuracy and so they don't use that either.
But that's not what you meant by "irrelevant" and so I agree with you... The YEAR datepart prior to 1753 is irrelevant because MS didn't account for calendar changes. DATEDIFF_BIG would also be irrelevant if MS had done things right from the git-go instead of doing a quick release of crippled functionality to solve marketing issues. It's the same thing they did with PIVOT, FORMAT, etc, etc, etc.
And, still, BOL doesn't ever suggest that the proper way to constrain a whole-date range (even if it's for just one day) is to use the ol' Somedatetimecolumn >= @StartDT AND Somedatetimecolumn < @EndDT+1. I continue to be amazed at the number of posts that insist the right way is to use Somedatetimecolumn BETWEEN @StartDT AND CAST((CAST(CAST(@EndDT AS DATE) AS VARCHAR(xx))+'23:59:59') AS DATETIME2(x)) or some other equally insane, overly complex, and sometimes erroneous method (for example, 23:59:59 leaves out nearly a whole second of the day).
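As a concrete illustration of the half-open range pattern described above (the table and column names are hypothetical):

```sql
-- Sketch: constraining a whole-date range with a half-open interval. The
-- column is left bare on one side of each comparison, so an index on
-- SomeDateTimeCol can still be used for a seek.
DECLARE @StartDT DATETIME = '2018-09-01',
        @EndDT   DATETIME = '2018-09-14';

SELECT *
FROM dbo.SomeTable                        -- hypothetical table
WHERE SomeDateTimeCol >= @StartDT
  AND SomeDateTimeCol <  @EndDT + 1;      -- first instant of the day AFTER @EndDT

-- By contrast, BETWEEN @StartDT AND '...23:59:59' silently misses any row
-- stamped in the final second of the day (e.g. 23:59:59.500), and the
-- "correct" upper bound changes again at every fractional-second precision.
```

Note that `@EndDT + 1` relies on DATETIME's direct date math; with DATE or DATETIME2 the equivalent upper bound would be `DATEADD(dd, 1, @EndDT)`.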
Heh... can you tell that you've hit a couple of hotspots for me? 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
September 14, 2018 at 9:03 am
+1 billion on FOMONTH. How on earth was that not so obviously needed for exactly the same reason as EOMONTH? Grrrr.
September 14, 2018 at 9:10 am
andycadley - Friday, September 14, 2018 9:03 AM
+1 billion on FOMONTH. How on earth was that not so obviously needed for exactly the same reason as EOMONTH? Grrrr.
Surely a FOMONTH function would return the same date for every month i.e. the 1st?
September 14, 2018 at 9:35 am
Jonathan AC Roberts - Friday, September 14, 2018 9:10 AM
Surely a FOMONTH function would return the same date for every month, i.e. the 1st?
EOMONTH returns the end of month as a date, so 3rd Sept 2018 returns 30th Sept 2018. A FOMONTH would do the same, i.e. give 1st Sept 2018. Admittedly doing that with maths is easier, but offering one convenience function and omitting the other is crazy.
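A quick sketch of what exists versus what has to be hand-rolled (the sample date is arbitrary; FOMONTH itself is the hypothetical function being wished for here):

```sql
-- EOMONTH has existed since SQL Server 2012; there is no FOMONTH, so the
-- first of the month needs an expression such as one of these:
DECLARE @d DATE = '2018-09-03';

SELECT EOMONTH(@d)                           AS EndOfMonth,     -- 2018-09-30
       DATEFROMPARTS(YEAR(@d), MONTH(@d), 1) AS FirstOfMonth,   -- 2018-09-01
       DATEADD(mm, DATEDIFF(mm, 0, @d), 0)   AS FirstOfMonthV2; -- old-school form, returns DATETIME
```

The `DATEADD`/`DATEDIFF` form predates DATEFROMPARTS and still works on pre-2012 versions, which is part of why both idioms circulate.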
September 14, 2018 at 10:03 am
andycadley - Friday, September 14, 2018 9:03 AM
+1 billion on FOMONTH. How on earth was that not so obviously needed for exactly the same reason as EOMONTH? Grrrr.
The day for the end of the month varies depending on what month it is and secondarily on whether it is a leap year. The first of the month is always 1. That difference in variability is exactly the reason that one has a function and the other does not.
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
September 14, 2018 at 10:25 am
drew.allen - Friday, September 14, 2018 10:03 AM
The day for the end of the month varies depending on what month it is and secondarily on whether it is a leap year. The first of the month is always 1. That difference in variability is exactly the reason that one has a function and the other does not.
Drew
But writing DATEFROMPARTS(DatePart(Year, @d), DatePart(Month, @d), 1) to do what should just be FOMONTH(@d) is laborious and error-prone for no good reason. Yes, the first of the month is easier to get to, but if you're adding one function, adding the other seems like common sense.
September 14, 2018 at 10:49 am
andycadley - Friday, September 14, 2018 10:25 AM
But writing DATEFROMPARTS(DatePart(Year, @d), DatePart(Month, @d), 1) to do what should just be FOMONTH(@d) is laborious and error-prone for no good reason. Yes, the first of the month is easier to get to, but if you're adding one function, adding the other seems like common sense.
I have a greater need for FOMONTH than EOMONTH actually, since the majority of the time I need <somedatecolumn> >= <FirstOfTheMonth> AND <somedatecolumn> < <FirstOfNextMonth>.
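That usage pattern can be sketched like this (the table and column names are hypothetical):

```sql
-- Sketch: a half-open month window built from first-of-month boundaries,
-- keeping the column side of each predicate bare so indexes stay usable.
DECLARE @AnyDateInMonth DATE = '2018-09-14';

DECLARE @FirstOfMonth     DATE = DATEFROMPARTS(YEAR(@AnyDateInMonth), MONTH(@AnyDateInMonth), 1);
DECLARE @FirstOfNextMonth DATE = DATEADD(mm, 1, @FirstOfMonth);

SELECT *
FROM dbo.SomeTable                 -- hypothetical table
WHERE SomeDateCol >= @FirstOfMonth
  AND SomeDateCol <  @FirstOfNextMonth;
```

The half-open upper bound catches every time-of-day value in the last day of the month, which an EOMONTH-based `BETWEEN` would not do for datetime columns.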
September 14, 2018 at 10:53 am
andycadley - Friday, September 14, 2018 9:03 AM
+1 billion on FOMONTH. How on earth was that not so obviously needed for exactly the same reason as EOMONTH? Grrrr.
Doesn't anyone realize that DATEFROMPARTS(YEAR(SomeDateValue), MONTH(SomeDateValue), 1) is the functional equivalent? 😀😀😀
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)