December 30, 2015 at 3:44 pm
drew.allen (12/30/2015)
Jeff Moden (12/30/2015)
Generating data to the ms across a large range of dates is very important to testing.
I would say that depends on what you're testing and the granularity of the expected results. In this case, we're looking at timecard data. Here I expect the granularity to be no smaller than 1/4 hour, so having data to the nearest minute should be sufficient, but having data to the nearest second is not unreasonable. Requiring data to the nearest ms is completely unreasonable, especially since--as you already mentioned--most timecard systems don't even record data to the nearest ms.
Drew
But now you're arguing the example instead of the principle behind the problem. There are a lot of different things that might require a full datetime listing across many years, especially when it comes to the generation of realistic test data. And my point has been made: even you thought it impossible to do without a trip through DATETIME world (although it's patently not).
Would it be better if I changed the problem definition to not say "Time Card" and just changed it to "a million events with a StartDT and EndDT representing some duration from 4 to 18 hours with a 1/300th of a second resolution" so that we can stop arguing trivial semantics surrounding the nature of Time Cards?
I wrote my solution last night but I'm still at work and it's at home. I'll post it after I get home.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 30, 2015 at 9:33 pm
FridayNightGiant (12/30/2015)
Thought I would have a go at your challenge Jeff, just for fun as it is quiet in work. Looking forward to any critiques - not sure if I went the correct way with randomising the dates.
I didn't randomise milliseconds.
Create Table #TimeCardHistory
(#TimeId int identity(1,1) not null
,StartDt datetime not null
,EndDt datetime not null
);
with cte as (
select 0 as Number union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0),
cte1 as (
select 0 as number from cte c1
cross join cte c2
cross join cte c3
cross join cte c4
cross join cte c5
cross join cte c6)
insert into #TimeCardHistory(StartDt, EndDt)
select dateadd(day,-RandDay,dateadd(SECOND,-RandSecond,GETDATE())) as StartDt
,dateadd(day,-RandDay,dateadd(second,-RandSecond,dateadd(hour,iif(RandEndHour <4,4,RandEndHour),dateadd(minute, RandEndMinute,dateadd(second,RandEndSecond,GETDATE()))))) As EndDt
from cte1
cross join (select ABS(CHECKSUM(NEWID()) % 5475) as RandDay
,ABS(CHECKSUM(NEWID()) % 86400) as RandSecond
,ABS(CHECKSUM(NEWID()) % 18) as RandEndHour
,ABS(CHECKSUM(NEWID()) % 60) as RandEndMinute
,ABS(CHECKSUM(NEWID()) % 60) as RandEndSecond) Rnd;
select Sum(DATEDIFF(ms,startDt, EndDt) / 3600000.0) as TotalHours
from #TimeCardHistory;
Thanks for the try, the post, the code, and the time you took to put it together. I very much appreciate it especially since you just up and volunteered the effort. I also appreciate the Itzik-style "pseudo-cursor" you used as a row source for the table population.
Let's check out the results. First, let's see what we get for the min and max dates and times...
--===== Check for min and max dates
SELECT MinStartDT = MIN(StartDT)
,MaxStartDT = MAX(StartDT)
,MinEndDT = MIN(EndDT)
,MaxEndDT = MAX(EndDT)
FROM #TimeCardHistory
;
Here are the results from when I ran your code...
MinStartDT MaxStartDT MinEndDT MaxEndDT
----------------------- ----------------------- ----------------------- -----------------------
2001-01-02 21:59:47.650 2015-12-30 21:24:30.650 2001-01-03 03:12:57.650 2015-12-31 12:22:45.650
I was thinking that the start date was missed by more than a year, so I ran your table population multiple times to rule out any randomness that might have caused it. It didn't matter. I always got something from 2001 instead of 2000. The problem turns out to be the following line...
ABS(CHECKSUM(NEWID()) % 5475) AS RandDay
5475 is approximately 15 years of duration. There are actually 16 years of duration... all of 2000 through all of 2015. What you might want to consider is using DATEDIFF to calculate the difference in days between the given start date and the given end date and then using DATEADD to add one day... or just use the end date - the start date + 1 if both are of the DATETIME datatype.
That brings us to another problem. You subtracted RandDay from GETDATE(). That means that the domain of days actually changes over time. That may or may not be a good thing depending on what you're testing for but, in this case, it violates the given requirements. In just a couple of days, the largest end date will be in 2016 instead of 2015 and the smallest start date will be even further from 2000-01-01. Since you do understand how to use NEWID() to produce random values, I'm not sure why you went the GETDATE() route. Once you got the number of random days right, adding the random date parts you calculated to the given start date would have done the trick for the StartDT column.
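Just as food for thought, here's a quick sketch of what I mean (the variable names are mine, not from your code):
--===== Sketch: derive the day count from the actual endpoints instead of
--      hardcoding 5475, and anchor the random day to the fixed start date
--      instead of GETDATE().
DECLARE @StartDate DATETIME = '2000-01-01'
       ,@EndDate   DATETIME = '2015-12-31';
SELECT RandDay = ABS(CHECKSUM(NEWID())) % (DATEDIFF(dd,@StartDate,@EndDate)+1);
--===== DATEADD(dd,RandDay,@StartDate) then covers all 16 years, 2000 thru 2015.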
For the 4 hour thing you did with the IIF decision, let's see what that does to the random "curve", which should be mostly flat...
--===== Determine the number of samples per duration hour.
-- They should be relatively flat due to the pseudo-randomness of NEWID().
SELECT Hr = DATEDIFF(mi,StartDT,EndDT)/60, TheCount = COUNT(*)
FROM #TimeCardHistory
GROUP BY DATEDIFF(mi,StartDT,EndDT)/60
ORDER BY Hr
;
Here's the result set from that...
Hr TheCount
----------- -----------
4 274707
5 57209
6 55789
7 55586
8 55898
9 56041
10 55554
11 55649
12 55502
13 55457
14 55900
15 55386
16 55347
17 55500
18 475
First, out of the 1,000,000 samples, more than 1/4 of them fall into the 4-hour duration category. No matter how many times you run the table population, the relative order of magnitude is always the same. That's not exactly what most would consider random, and it could affect testing depending on what you're testing for. It could also be an unexpectedly useful bit. Since you included 18 hours, it shows up as the expected rare event.
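If a flat curve is what's needed, one alternative (just a sketch with made-up names, not a correction of your code) is to draw the whole duration in seconds as a single uniform random number rather than clamping the hour part:
--===== Sketch: one uniform random duration between 4 and 18 hours, at a
--      1 second resolution, instead of clamping the hour part with IIF.
SELECT RandDurSec = 4*3600 + ABS(CHECKSUM(NEWID())) % ((18-4)*3600 + 1);
--===== DATEADD(ss,RandDurSec,StartDt) then yields a flat duration histogram.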
With no reflection on you, I always get a kick out of people who say that direct DATETIME math involves the possibility of too many errors. Look at all the direct temporal math you had to do here: knowing how many days were in the date range, knowing how many seconds there are in a day, using the NEWID() trick on each datepart to produce a random value, not to mention knowing that there are 3.6 million milliseconds in an hour. Lots of room for folks less temporally inclined to make errors there. Lots and lots of code, as well.
On the last part (the SUM of all the durations), you made the realization that no row would have a duration in ms that would cause an arithmetic overflow. For this example, it of course works. If it weren't a simple timecard example limited to a max of 18 hours, it would fail if the duration were in excess of 24 days, 20 hours, 31 minutes, 23 seconds, and 647 milliseconds (2,147,483,647 ms, the max that DATEDIFF can produce without error). Some don't consider that to be a problem, but I do, if only because I know that people looking for a solution to a similar problem might use the same code for something else. I think of it as a form of technical debt that hasn't happened yet.
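If you want to see where that limit lives, these two statements demonstrate it:
--===== DATEDIFF returns INT, so ms differences overflow at roughly 24.86 days.
SELECT DATEDIFF(ms,'2000-01-01','2000-01-25'); --2,073,600,000 ms... works
SELECT DATEDIFF(ms,'2000-01-01','2000-01-26'); --2,160,000,000 ms... overflow error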
Great try and thank you again for your effort. It gave me some ideas for other things. :-D
--Jeff Moden
Change is inevitable... Change for the better is not.
December 30, 2015 at 11:12 pm
Alright... it looks like Drew isn't going to bite on this one.
Here's the code I wrote last night with the validation checks that I added today. Details are in the comments in the code.
--=================================================================================================
-- Controls and presets
--=================================================================================================
--===== Create "parameter" variables for the start and end years.
-- Note that I'm not doing any error checking for this example
-- but would for production code.
DECLARE @pStartYear DATETIME
,@pEndYear DATETIME
,@pMinHours INT
,@pMaxHours INT
,@pDesiredRows INT
;
SELECT @pStartYear = '2000' --Resolves to 2000-01-01
,@pEndYear = '2015' --Resolves to 2015-01-01
,@pMinHours = 4
,@pMaxHours = 18
,@pDesiredRows = 1000000
;
--===== Add 1 to the end year so we can use a Closed/Open Temporal Range
-- to simplify our calculations.
SELECT @pEndYear = DATEADD(yy,1,@pEndYear) --Resolves to 2016-01-01 for this example
; --and will be "non-inclusive" or "temporally open".
--=================================================================================================
-- Create the table to meet the given requirements and populate it on-the-fly.
-- Even if we hardcoded the values, this wouldn't get much longer or more complicated.
-- It's part of why I love the DATETIME datatype and hate how they've crippled
-- the newer temporal datatypes. Compare this to the other code written on this post.
--=================================================================================================
--===== Create and populate the test table on-the-fly
WITH
cteStartDateTime(StartDT) AS
(
SELECT TOP (@pDesiredRows)
RAND(CHECKSUM(NEWID()))*DATEDIFF(dd,@pStartYear,@pEndYear)+@pStartYear
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2
)
SELECT RowNum = ISNULL(ROW_NUMBER() OVER (ORDER BY StartDT),0) --ISNULL makes this a NOT NULL column.
,StartDT = StartDT
,EndDT = RAND(CHECKSUM(NEWID()))
* CAST(DATEADD(hh,@pMaxHours-@pMinHours,0) AS DECIMAL(28,17))
+ DATEADD(hh,@pMinHours,StartDT)
INTO #TimeCardHistory
FROM cteStartDateTime
;
--=================================================================================================
-- Calculate the sum of the durations
--=================================================================================================
--===== Calculate the total duration in decimal hours.
-- This won't break if we exceed things like the max for ms or seconds
-- for a given row duration.
SELECT SUM(CAST(EndDT AS DECIMAL(28,17))-CAST(StartDT AS DECIMAL(28,17)))*24
FROM #TimeCardHistory
;
--=================================================================================================
-- Check the data to see if we meet the requirements
--=================================================================================================
--===== Check for min and max dates.
-- Min StartDT should be very close to or equal to the start of the year of @pStartYear.
-- Max EndDT should be very close to the end of the year on @pEndYear.
SELECT MinStartDT = MIN(StartDT)
,MaxStartDT = MAX(StartDT)
,MinEndDT = MIN(EndDT)
,MaxEndDT = MAX(EndDT)
FROM #TimeCardHistory
;
--===== Determine the number of samples per clock face hour.
-- They should be relatively flat due to the pseudo-randomness of NEWID()
-- from @pMinHours up to and not including @pMaxHours, which should be
-- fairly sparse because it's at the bitter end of the temporal range.
SELECT Hr = DATEDIFF(mi,StartDT,EndDT)/60, TheCount = COUNT(*)
FROM #TimeCardHistory
GROUP BY DATEDIFF(mi,StartDT,EndDT)/60
ORDER BY Hr
;
--Jeff Moden
Change is inevitable... Change for the better is not.
December 31, 2015 at 2:20 am
Jeff - thank you for taking the time to do a detailed analysis of my solution. Much appreciated! It is always good when something started out as a bit of fun turns into a valuable learning exercise. I look forward to playing with and understanding your solution.
Have a good New Year
December 31, 2015 at 4:24 am
Jeff Moden (12/30/2015)
--=================================================================================================
-- Calculate the sum of the durations
--=================================================================================================
--===== Calculate the total duration in decimal hours.
-- This won't break if we exceed things like the max for ms or seconds
-- for a given row duration.
SELECT SUM(CAST(EndDT AS DECIMAL(28,17))-CAST(StartDT AS DECIMAL(28,17)))*24
FROM #TimeCardHistory
;
Careful here Jeff, the type cast to DECIMAL is expensive; consider using FLOAT, whose conversion is less costly.
😎
SELECT SUM(CAST(EndDT AS FLOAT)-CAST(StartDT AS FLOAT))*24
FROM #TimeCardHistory
;
Quick test
RAISERROR('--- DECIMAL -------------------------',0,0) WITH NOWAIT;
SET STATISTICS TIME ON;
SELECT SUM(CAST(EndDT AS DECIMAL(28,17))-CAST(StartDT AS DECIMAL(28,17)))*24
FROM #TimeCardHistory
;
SET STATISTICS TIME OFF;
RAISERROR('--- FLOAT -------------------------',0,0) WITH NOWAIT;
SET STATISTICS TIME ON;
SELECT SUM(CAST(EndDT AS FLOAT)-CAST(StartDT AS FLOAT))*24
FROM #TimeCardHistory
;
SET STATISTICS TIME OFF;
RAISERROR('--- DATEDIFF -------------------------',0,0) WITH NOWAIT;
SET STATISTICS TIME ON;
SELECT
SUM((0.0 + DATEDIFF(MILLISECOND,TCH.StartDT,TCH.EndDT)) / 3600000)
FROM #TimeCardHistory TCH;
SET STATISTICS TIME OFF;
Results
--- DECIMAL -------------------------
---------------------------------------
11001236.13306118989600
SQL Server Execution Times: CPU time = 1295 ms, elapsed time = 1308 ms.
--- FLOAT -------------------------
----------------------
11001236.1330615
SQL Server Execution Times: CPU time = 312 ms, elapsed time = 301 ms.
--- DATEDIFF -------------------------
---------------------------------------
11001236.124935183
SQL Server Execution Times: CPU time = 733 ms, elapsed time = 736 ms.
Edit: Added the actual query results. Interestingly, the difference between DECIMAL and FLOAT is 0.001 second, but between FLOAT and DATEDIFF (MILLISECOND) it is 29.255 seconds (working it out from the results above: 0.0000003 hours x 3600 is roughly 0.001 s, while 0.0081263 hours x 3600 is roughly 29.255 s).
Slightly more elaborate test
DECLARE @timer TABLE (T_TEXT VARCHAR(16) NOT NULL, T_TS DATETIME2(7) NOT NULL DEFAULT (SYSDATETIME()));
DECLARE @BIGINT_BUCKET BIGINT = 0;
DECLARE @DATE_BUCKET_01 DATETIME = 0;
DECLARE @DATE_BUCKET_02 DATETIME = 0;
DECLARE @FLOAT_BUCKET FLOAT = 0.0;
INSERT INTO @timer(T_TEXT) VALUES ('DRY RUN');
SELECT
@BIGINT_BUCKET = TCH.RowNum
,@DATE_BUCKET_01 = TCH.StartDT
,@DATE_BUCKET_02 = TCH.EndDT
FROM #TimeCardHistory TCH
INSERT INTO @timer(T_TEXT) VALUES ('DRY RUN');
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL 1');
SELECT @FLOAT_BUCKET = SUM(CAST(EndDT AS DECIMAL(28,17))-CAST(StartDT AS DECIMAL(28,17)))*24
FROM #TimeCardHistory
--OPTION (
-- RECOMPILE
-- ,QUERYTRACEON 3604
-- ,QUERYTRACEON 8607
-- )
;
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL 1');
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT 1');
SELECT @FLOAT_BUCKET = SUM(CAST(EndDT AS FLOAT)-CAST(StartDT AS FLOAT))*24
FROM #TimeCardHistory
--OPTION (
-- RECOMPILE
-- ,QUERYTRACEON 3604
-- ,QUERYTRACEON 8607
-- )
;
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT 1');
INSERT INTO @timer(T_TEXT) VALUES ('DATEDIFF 1');
SELECT
@FLOAT_BUCKET = SUM((0.0 + DATEDIFF(MILLISECOND,TCH.StartDT,TCH.EndDT)) / 3600000)
FROM #TimeCardHistory TCH
--OPTION (
-- RECOMPILE
-- ,QUERYTRACEON 3604
-- ,QUERYTRACEON 8607
-- )
;
INSERT INTO @timer(T_TEXT) VALUES ('DATEDIFF 1');
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL 2');
SELECT @FLOAT_BUCKET = SUM(CAST(EndDT AS DECIMAL(28,17))-CAST(StartDT AS DECIMAL(28,17)))*24
FROM #TimeCardHistory
;
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL 2');
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT 2');
SELECT @FLOAT_BUCKET = SUM(CAST(EndDT AS FLOAT)-CAST(StartDT AS FLOAT))*24
FROM #TimeCardHistory;
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT 2');
INSERT INTO @timer(T_TEXT) VALUES ('DATEDIFF 2');
SELECT
@FLOAT_BUCKET = SUM((0.0 + DATEDIFF(MILLISECOND,TCH.StartDT,TCH.EndDT)) / 3600000)
FROM #TimeCardHistory TCH;
INSERT INTO @timer(T_TEXT) VALUES ('DATEDIFF 2');
SELECT
T.T_TEXT
,DATEDIFF(MICROSECOND,MIN(T.T_TS),MAX(T.T_TS)) AS DURATION
FROM @timer T
GROUP BY T.T_TEXT
ORDER BY DURATION ASC;
Results
T_TEXT DURATION
---------------- -----------
DRY RUN 190011
FLOAT 1 289016
FLOAT 2 310018
DATEDIFF 2 681039
DATEDIFF 1 708041
DECIMAL 2 1243071
DECIMAL 1 1262072
December 31, 2015 at 7:46 am
Jeff Moden (12/30/2015)
Here's the code I wrote last night with the validation checks that I added today. Details are in the comments in the code.
GREAT! But suppose your company is going multi-national and they want to record dates using DATETIMEOFFSET. Update your code to use a DATETIMEOFFSET field with offsets ranging between -14:00 and +14:00.
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
December 31, 2015 at 9:09 am
Eirikur Eiriksson (12/31/2015)
Slightly more elaborate test
Eirikur, I've really come to enjoy your test harness. I particularly like how it uses the actual execution times instead of SET STATISTICS and I appreciate the simplicity of your timing table. Personally, I run performance comparisons using a number of different approaches, but I seem to write them from scratch every single time. This is something I've wanted to standardize for quite some time, but I haven't quite gotten around to doing yet. Looking at it today (without any work to do ;-)), makes me see it in a different light. Your test harness inspires me to get it done. A sincere thank you for the kick I needed.
December 31, 2015 at 10:53 am
Ed Wagner (12/31/2015)
Eirikur, I've really come to enjoy your test harness. I particularly like how it uses the actual execution times instead of SET STATISTICS and I appreciate the simplicity of your timing table. Personally, I run performance comparisons using a number of different approaches, but I seem to write them from scratch every single time. This is something I've wanted to standardize for quite some time, but I haven't quite gotten around to doing yet. Looking at it today (without any work to do ;-)), makes me see it in a different light. Your test harness inspires me to get it done. A sincere thank you for the kick I needed.
You are very welcome to it Ed, scavenge whatever you find usable there 😀
I like the method of using a table variable as I've found it to be the one least affected by, and least affecting, the actual test runs. Table variables do not carry the same locking overhead as temporary or normal tables and are in fact implemented with the NOLOCK hint behind the scenes, which makes them more suitable for this purpose. I do have a few analytical models in both Power BI and Power Pivot if you are interested; how about a little collaboration on this (you ask me for what you want and I'll give it to you 😉 )
😎
Another thing that is very useful is the query trace flags, which enable one to see the actual execution plan or the optimizer's Output Tree without having to wait for the execution to complete, a nifty thing when analyzing queries that do not finish. Mind you, using the trace flags does affect the execution time.
OPTION (
RECOMPILE
,QUERYTRACEON 3604
,QUERYTRACEON 8607
)
January 1, 2016 at 4:40 pm
drew.allen (12/31/2015)
Jeff Moden (12/30/2015)
Here's the code I wrote last night with the validation checks that I added today. Details are in the comments in the code.
GREAT! But suppose your company is going multi-national and they want to record dates using DATETIMEOFFSET. Update your code to use a DATETIMEOFFSET field with offsets ranging between -14:00 and +14:00.
Drew
Drew, can you elaborate a little bit more on this: does every entry start in the same timezone as it ends in? Normally one would do UTC correlation on entry to the database, which makes things much simpler.
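For example, a rough sketch of that UTC correlation (hypothetical entry value):
--===== Sketch: normalize a DATETIMEOFFSET entry to UTC at the door so that
--      every stored value shares the same offset.
DECLARE @ENTRY DATETIMEOFFSET(7) = '2016-01-01 09:00:00.0000000 -05:00';
SELECT SWITCHOFFSET(@ENTRY,'+00:00') AS UTC_ENTRY; -- 2016-01-01 14:00:00.0000000 +00:00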
😎
January 1, 2016 at 5:09 pm
Eirikur Eiriksson (12/31/2015)
Careful here Jeff, the type cast to DECIMAL is expensive; consider using FLOAT, whose conversion is less costly.
Understood. FLOAT only has a precision of 15 and milliseconds require that precision when you consider what fraction of a day they are. Are you saying that the SUM of FLOATs would somehow overcome that little problem in the final answer? {Edit} Never mind... I see that you are. Thanks for the code. I'll check it out.
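For anyone following along, the back-of-the-envelope math behind that concern goes like this:
--===== 1 ms = 1/86,400,000 of a day, about 1.157E-8 days. A DATETIME at the
--      end of 2015 sits roughly 42,368 days from the 1900-01-01 base date, so
--      holding it to the ms takes ~13 significant digits out of FLOAT's ~15.
SELECT CAST(CAST('2015-12-31 23:59:59.997' AS DATETIME) AS FLOAT); --approx. 42367.9999999653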
--Jeff Moden
Change is inevitable... Change for the better is not.
January 1, 2016 at 5:13 pm
drew.allen (12/31/2015)
Jeff Moden (12/30/2015)
Here's the code I wrote last night with the validation checks that I added today. Details are in the comments in the code.
GREAT! But suppose your company is going multi-national and they want to record dates using DATETIMEOFFSET. Update your code to use a DATETIMEOFFSET field with offsets ranging between -14:00 and +14:00.
Drew
Ostensibly, the StartDT and EndDT would be represented using the same DATETIMEOFFSET, which also does NOT change the fact that there are 24 hours in a day. I don't see the problem there.
Shifting gears a bit, I've not seen a lick of code on this problem from you. Rather than continuing to flip zingers, would you provide some demonstrable code to make your points so that we can actually test them? Thanks.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 2, 2016 at 7:24 am
Jeff Moden (1/1/2016)
Eirikur Eiriksson (12/31/2015)
Careful here Jeff, the type cast to DECIMAL is expensive; consider using FLOAT, whose conversion is less costly.
Understood. FLOAT only has a precision of 15 and milliseconds require that precision when you consider what fraction of a day they are.
I've found that, compared to the date and time functions, float has adequate precision, as the dt functions only seem to use a precision of 6. Generating a datetime data type test set using milliseconds will result in rounding differences, which makes it harder to compare to other datatypes and methods; change that to seconds and it will match the others.
😎
Quick test set / harness
--USE TEST;
--GO
SET NOCOUNT ON;
--/* -- UNCOMMENT THIS LINE TO SKIP THE TEST SET GENERATION
IF OBJECT_ID(N'dbo.TBL_TIME_CARD') IS NOT NULL DROP TABLE dbo.TBL_TIME_CARD;
CREATE TABLE dbo.TBL_TIME_CARD
(
TIME_CARD_ID INT IDENTITY(1,1) NOT NULL CONSTRAINT PK_DBO_TBL_TIME_CARD_TIME_CARD_ID PRIMARY KEY CLUSTERED
,DAY_OFFSET INT NOT NULL
,START_TIME INT NOT NULL
,UTC_OFFSET_S INT NOT NULL
,DURATION_SEC INT NOT NULL
,START_DATE_DT DATETIME NOT NULL
,END_DATE_DT DATETIME NOT NULL
,START_DATE_DT2 DATETIME2(7) NOT NULL
,END_DATE_DT2 DATETIME2(7) NOT NULL
);
-- CONTROL VARIABLES
DECLARE @SAMPLE_SIZE INT = 1000000; -- 60MB DATA
DECLARE @CREATE_INDICES INT = 0; -- SET TO 1 TO CREATE INDICES, ADDS 200MB FOR 1000000 ROWS
DECLARE @MIN_DURATION_HOURS BIGINT = 4;
DECLARE @MAX_DURATION_HOURS BIGINT = 18;
DECLARE @MIN_DURATION BIGINT = @MIN_DURATION_HOURS * 3600;
DECLARE @MAX_DURATION BIGINT = @MAX_DURATION_HOURS * 3600;
DECLARE @MIN_DATE DATETIME = CONVERT(DATETIME,'2000-01-01',120);
DECLARE @MAX_DATE DATETIME = CONVERT(DATETIME,'2016-01-01',120);
DECLARE @DATE_RANGE INT = DATEDIFF(DAY,@MIN_DATE,@MAX_DATE);
DECLARE @DAY_SEC INT = 86400;
DECLARE @TZ_COUNT INT = 24;
DECLARE @TZ_DURATION_S INT = 3600;
-- INLINE TALLY TABLE 20^8, 25,600,000,000
;WITH T(N) AS (SELECT N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0),
(0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) AS X(N))
, NUMS(N) AS (SELECT TOP (@SAMPLE_SIZE) ROW_NUMBER() OVER(ORDER BY (SELECT NULL)) AS N
FROM T T1,T T2,T T3,T T4,T T5,T T6,T T7,T T8)
,BASE_DATA AS
(
SELECT
( ABS(CHECKSUM(NEWID())) % @DATE_RANGE) AS DAY_OFFSET
,( ABS(CHECKSUM(NEWID())) % @DAY_SEC) AS START_TIME
,((ABS(CHECKSUM(NEWID())) % (@MAX_DURATION - @MIN_DURATION)) + @MIN_DURATION) AS DURATION_SEC
,( CHECKSUM(NEWID()) % (@TZ_COUNT / 2)) * @TZ_DURATION_S AS UTC_OFFSET_S
FROM NUMS NM
)
INSERT INTO dbo.TBL_TIME_CARD
(
DAY_OFFSET
,START_TIME
,UTC_OFFSET_S
,DURATION_SEC
,START_DATE_DT
,END_DATE_DT
,START_DATE_DT2
,END_DATE_DT2
)
SELECT
BD.DAY_OFFSET
,BD.START_TIME
,BD.UTC_OFFSET_S
,BD.DURATION_SEC
,DATEADD(SECOND,BD.START_TIME
,DATEADD(DAY,BD.DAY_OFFSET,@MIN_DATE)) AS START_DATE_DT
,DATEADD(SECOND,BD.DURATION_SEC
,DATEADD(SECOND,BD.START_TIME
,DATEADD(DAY,BD.DAY_OFFSET,@MIN_DATE))) AS END_DATE_DT
,DATEADD(SECOND,BD.START_TIME
,DATEADD(DAY,BD.DAY_OFFSET,CONVERT(DATETIME2(7),@MIN_DATE,0))) AS START_DATE_DT2
,DATEADD(SECOND,BD.DURATION_SEC
,DATEADD(SECOND,BD.START_TIME
,DATEADD(DAY,BD.DAY_OFFSET,CONVERT(DATETIME2(7),@MIN_DATE,0)))) AS END_DATE_DT2
FROM BASE_DATA BD
ORDER BY BD.DAY_OFFSET
,BD.START_TIME;
IF (@CREATE_INDICES = 1)
BEGIN
-- DATETIME INDICES
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_START_DATE_DT_INCL_END_DATE_DT ON dbo.TBL_TIME_CARD ( START_DATE_DT ASC ) INCLUDE ( END_DATE_DT );
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_END_DATE_DT_INCL_START_DATE_DT ON dbo.TBL_TIME_CARD ( END_DATE_DT ASC ) INCLUDE ( START_DATE_DT );
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_DURATION_SEC_INCL_START_DATE_DT ON dbo.TBL_TIME_CARD ( DURATION_SEC ASC ) INCLUDE ( START_DATE_DT );
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_START_DATE_DT_END_DATE_DT_INCL_DURATION_SEC ON dbo.TBL_TIME_CARD ( START_DATE_DT ASC , END_DATE_DT ASC ) INCLUDE ( DURATION_SEC );
-- DATETIME2(7) INDICES
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_START_DATE_DT2_INCL_END_DATE_DT2 ON dbo.TBL_TIME_CARD ( START_DATE_DT2 ASC ) INCLUDE ( END_DATE_DT2 );
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_END_DATE_DT2_INCL_START_DATE_DT2 ON dbo.TBL_TIME_CARD ( END_DATE_DT2 ASC ) INCLUDE ( START_DATE_DT2 );
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_DURATION_SEC_INCL_START_DATE_DT2 ON dbo.TBL_TIME_CARD ( DURATION_SEC ASC ) INCLUDE ( START_DATE_DT2 );
CREATE NONCLUSTERED INDEX NCLIDX_DBO_TBL_TIME_CARD_START_DATE_DT2_END_DATE_DT2_INCL_DURATION_SEC ON dbo.TBL_TIME_CARD ( START_DATE_DT2 ASC , END_DATE_DT2 ASC ) INCLUDE ( DURATION_SEC );
END
-- */
SELECT
MIN(TC.START_DATE_DT ) AS MinStartDT_DT
,MAX(TC.START_DATE_DT ) AS MaxStartDT_DT
,MIN(TC.END_DATE_DT ) AS MinEndDT_DT
,MAX(TC.END_DATE_DT ) AS MaxEndDT_DT
,MIN(TC.START_DATE_DT2 ) AS MinStartDT_DT2
,MAX(TC.START_DATE_DT2 ) AS MaxStartDT_DT2
,MIN(TC.END_DATE_DT2 ) AS MinEndDT_DT2
,MAX(TC.END_DATE_DT2 ) AS MaxEndDT_DT2
FROM dbo.TBL_TIME_CARD TC;
SELECT
(TC.DURATION_SEC / 3600) AS DURATION_FROM
,(TC.DURATION_SEC / 3600) + 1 AS DURATION_TO
,COUNT(TC.TIME_CARD_ID) AS INST_COUNT
FROM dbo.TBL_TIME_CARD TC
GROUP BY (TC.DURATION_SEC / 3600)
ORDER BY (TC.DURATION_SEC / 3600);
;WITH BASE_DATA AS
(
SELECT
SUM(CONVERT(DECIMAL(28,17),TC.DURATION_SEC,0)) AS SUM_DURATION
,SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT,TC.END_DATE_DT)) AS SUM_DATEDIFF_DT
,SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT2,TC.END_DATE_DT2)) AS SUM_DATEDIFF_DT2
,SUM(CONVERT(DECIMAL(28,17),TC.END_DATE_DT,0)-CONVERT(DECIMAL(28,17),TC.START_DATE_DT,0)) AS DECIMAL_TICKS
,SUM(CONVERT(FLOAT,TC.END_DATE_DT,0)-CONVERT(FLOAT,TC.START_DATE_DT,0)) AS FLOAT_TICKS
FROM dbo.TBL_TIME_CARD TC
)
SELECT
BD.SUM_DURATION
,BD.SUM_DATEDIFF_DT
,BD.SUM_DATEDIFF_DT2
,BD.DECIMAL_TICKS
,BD.FLOAT_TICKS
,(BD.SUM_DURATION / 3600 ) AS DURATION_HOURS
,(BD.SUM_DATEDIFF_DT / 3600 ) AS DATEDIFF_HOURS_DT
,(BD.SUM_DATEDIFF_DT2 / 3600 ) AS DATEDIFF_HOURS_DT2
,(BD.DECIMAL_TICKS * 24) AS DECIMAL_HOURS
,(BD.FLOAT_TICKS * 24) AS FLOAT_HOURS
FROM BASE_DATA BD;
Output
MinStartDT_DT MaxStartDT_DT MinEndDT_DT MaxEndDT_DT MinStartDT_DT2 MaxStartDT_DT2 MinEndDT_DT2 MaxEndDT_DT2
----------------------- ----------------------- ----------------------- ----------------------- --------------------------- --------------------------- --------------------------- ---------------------------
2000-01-01 00:01:53.000 2015-12-31 23:56:18.000 2000-01-01 05:13:32.000 2016-01-01 16:22:58.000 2000-01-01 00:01:53.0000000 2015-12-31 23:56:18.0000000 2000-01-01 05:13:32.0000000 2016-01-01 16:22:58.0000000
DURATION_FROM DURATION_TO INST_COUNT
------------- ----------- -----------
4 5 71323
5 6 71856
6 7 71544
7 8 71502
8 9 71297
9 10 71256
10 11 71398
11 12 71147
12 13 71482
13 14 71438
14 15 71444
15 16 71505
16 17 71395
17 18 71413
SUM_DURATION SUM_DATEDIFF_DT SUM_DATEDIFF_DT2 DECIMAL_TICKS FLOAT_TICKS DURATION_HOURS DATEDIFF_HOURS_DT DATEDIFF_HOURS_DT2 DECIMAL_HOURS FLOAT_HOURS
--------------------------------------- --------------------------------------- --------------------------------------- --------------------------------------- ---------------------- --------------------------------------- --------------------------------------- --------------------------------------- --------------------------------------- ----------------------
39591875172.00000000000000000 39591875172.0 39591875172.0 458239.29597222235200000 458239.295972207 10997743.10333333333333333 10997743.103333 10997743.103333 10997743.10333333644800 10997743.103333
January 3, 2016 at 8:01 am
Timing the different methods using the previously posted test data set
😎
Test harness
-- TEST: Different ways of calculating duration
-- DATA SET: http://www.sqlservercentral.com/Forums/FindPost1749100.aspx
DECLARE @timer TABLE (T_TEXT VARCHAR(100) NOT NULL, T_TS DATETIME2(7) NOT NULL DEFAULT (SYSDATETIME()));
DECLARE @MT_ID INT = 0;
DECLARE @INT_BUCKET_01 INT = 0;
DECLARE @INT_BUCKET_02 INT = 0;
DECLARE @INT_BUCKET_03 INT = 0;
DECLARE @INT_BUCKET_04 INT = 0;
DECLARE @INT_BUCKET_05 INT = 0;
DECLARE @DATETIME_BUCKET_01 DATETIME = 0;
DECLARE @DATETIME_BUCKET_02 DATETIME = 0;
DECLARE @DATETIME2_7_BUCKET_01 DATETIME2(7) = CONVERT( DATETIME2(7), '2016-01-01',120);
DECLARE @DATETIME2_7_BUCKET_02 DATETIME2(7) = CONVERT( DATETIME2(7), '2016-01-01',120);
DECLARE @BIGINT_BUCKET_01 BIGINT = 0;
DECLARE @FLOAT_BUCKET FLOAT = 0.0;
--/*
INSERT INTO @timer(T_TEXT) VALUES ('DRY RUN');
SELECT
@INT_BUCKET_01 = TC.TIME_CARD_ID
,@INT_BUCKET_02 = TC.DAY_OFFSET
,@INT_BUCKET_03 = TC.START_TIME
,@INT_BUCKET_04 = TC.UTC_OFFSET_S
,@INT_BUCKET_05 = TC.DURATION_SEC
,@DATETIME_BUCKET_01 = TC.START_DATE_DT
,@DATETIME_BUCKET_02 = TC.END_DATE_DT
,@DATETIME2_7_BUCKET_01 = TC.START_DATE_DT2
,@DATETIME2_7_BUCKET_02 = TC.END_DATE_DT2
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('DRY RUN');
-- */
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DURATION 1');
SELECT
@FLOAT_BUCKET = SUM(CONVERT(DECIMAL(28,17),TC.DURATION_SEC,0)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DURATION 1');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DURATION 2');
SELECT
@FLOAT_BUCKET = SUM(CONVERT(DECIMAL(28,17),TC.DURATION_SEC,0)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DURATION 2');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DURATION 3');
SELECT
@FLOAT_BUCKET = SUM(CONVERT(DECIMAL(28,17),TC.DURATION_SEC,0)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DURATION 3');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT 1');
SELECT
@FLOAT_BUCKET = SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT,TC.END_DATE_DT)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT 1');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT 2');
SELECT
@FLOAT_BUCKET = SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT,TC.END_DATE_DT)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT 2');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT 3');
SELECT
@FLOAT_BUCKET = SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT,TC.END_DATE_DT)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT 3');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT2 1');
SELECT
@FLOAT_BUCKET = SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT2,TC.END_DATE_DT2)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT2 1');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT2 2');
SELECT
@FLOAT_BUCKET = SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT2,TC.END_DATE_DT2)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT2 2');
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT2 3');
SELECT
@FLOAT_BUCKET = SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT2,TC.END_DATE_DT2)) / 3600
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('SUM_DATEDIFF_DT2 3');
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL_TRICK 1');
SELECT
@FLOAT_BUCKET = SUM((CONVERT(DECIMAL(28,17),TC.END_DATE_DT,0)-CONVERT(DECIMAL(28,17),TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL_TRICK 1');
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL_TRICK 2');
SELECT
@FLOAT_BUCKET = SUM((CONVERT(DECIMAL(28,17),TC.END_DATE_DT,0)-CONVERT(DECIMAL(28,17),TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL_TRICK 2');
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL_TRICK 3');
SELECT
@FLOAT_BUCKET = SUM((CONVERT(DECIMAL(28,17),TC.END_DATE_DT,0)-CONVERT(DECIMAL(28,17),TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('DECIMAL_TRICK 3');
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT_TRICK 1');
SELECT
@FLOAT_BUCKET = SUM((CONVERT(FLOAT,TC.END_DATE_DT,0)-CONVERT(FLOAT,TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT_TRICK 1');
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT_TRICK 2');
SELECT
@FLOAT_BUCKET = SUM((CONVERT(FLOAT,TC.END_DATE_DT,0)-CONVERT(FLOAT,TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT_TRICK 2');
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT_TRICK 3');
SELECT
@FLOAT_BUCKET = SUM((CONVERT(FLOAT,TC.END_DATE_DT,0)-CONVERT(FLOAT,TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC;
INSERT INTO @timer(T_TEXT) VALUES ('FLOAT_TRICK 3');
-- Test results
SELECT
T.T_TEXT
,DATEDIFF(MICROSECOND,MIN(T.T_TS),MAX(T.T_TS)) AS DURATION
FROM @timer T
GROUP BY T.T_TEXT
ORDER BY DURATION ASC;
-- Verifying the calculations
SELECT
'ACTUAL' AS METHOD
,SUM(CONVERT(DECIMAL(28,17),TC.DURATION_SEC,0)) / 3600 AS VALUE
FROM dbo.TBL_TIME_CARD TC
UNION ALL
SELECT
'DATETIME'
,SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT,TC.END_DATE_DT)) / 3600
FROM dbo.TBL_TIME_CARD TC
UNION ALL
SELECT
'DATETIME2'
,SUM(0.0 + DATEDIFF(SECOND,TC.START_DATE_DT2,TC.END_DATE_DT2)) / 3600
FROM dbo.TBL_TIME_CARD TC
UNION ALL
SELECT
'DECIMAL'
,SUM((CONVERT(DECIMAL(28,17),TC.END_DATE_DT,0)-CONVERT(DECIMAL(28,17),TC.START_DATE_DT,0)) * 24 )
FROM dbo.TBL_TIME_CARD TC
UNION ALL
SELECT
'FLOAT'
,SUM((CONVERT(FLOAT,TC.END_DATE_DT,0)-CONVERT(FLOAT,TC.START_DATE_DT,0)) * 24)
FROM dbo.TBL_TIME_CARD TC;
Timing results
T_TEXT DURATION
-------------------- -----------
FLOAT_TRICK 3 308017
FLOAT_TRICK 2 315018
FLOAT_TRICK 1 316018
DRY RUN 331019
SUM_DURATION 1 424024
SUM_DURATION 3 425024
SUM_DURATION 2 431025
SUM_DATEDIFF_DT 3 543031
SUM_DATEDIFF_DT 2 595034
SUM_DATEDIFF_DT 1 626036
SUM_DATEDIFF_DT2 1 648037
SUM_DATEDIFF_DT2 2 649037
SUM_DATEDIFF_DT2 3 771044
DECIMAL_TRICK 3 1375079
DECIMAL_TRICK 1 1392080
DECIMAL_TRICK 2 1433082
Method verification
METHOD VALUE
--------- ----------------------
ACTUAL 10997743.1033333
DATETIME 10997743.103333
DATETIME2 10997743.103333
DECIMAL 10997743.1033333
FLOAT 10997743.1033331
January 4, 2016 at 7:56 am
Jeff Moden (1/1/2016)
Ostensibly, the StartDT and EndDT would be represented using the same DATETIMEOFFSET, which also does NOT change the fact that there are 24 hours in a day. I don't see the problem there.
Your code assigns a decimal value to a DATETIME field and uses an implicit conversion from decimal to datetime. Later on you CAST a DATETIME field to DECIMAL. If you change your DATETIME field to a DATETIMEOFFSET field you get the following errors: "Operand type clash: decimal is incompatible with datetimeoffset" for the assignment and "Explicit conversion from data type datetimeoffset to decimal is not allowed." for the explicit CAST. You don't see a problem with that?
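For anyone who wants to reproduce those errors, this is all it takes (each statement fails on its own; the variable is throwaway):
--===== Repro of the two errors described above.
DECLARE @DTO DATETIMEOFFSET;
SET @DTO = CAST(0.5 AS DECIMAL(28,17)); --Operand type clash
SELECT CAST(SYSDATETIMEOFFSET() AS DECIMAL(28,17)); --Explicit conversion not allowed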
Shifting gears a bit, I've not seen a lick of code on this problem from you. Rather than continuing to flip zingers, would you provide some demonstrable code to make your points so that we can actually test them? Thanks.
My point is that the ability to do decimal arithmetic on DATETIME and SMALLDATETIME data is an artifact of how those data types are implemented and not reflective of a meaningful operation on date/time data. Your point is that decimal arithmetic is more efficient than date/time arithmetic. I see no value in providing code to help you make your point when I have made my point without providing code.
Drew
[edited: changed "explicit" to "implicit"]
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
January 8, 2016 at 7:49 pm
drew.allen (1/4/2016)
Jeff Moden (1/1/2016)
Ostensibly, the StartDT and EndDT would be represented using the same DATETIMEOFFSET, which also does NOT change the fact that there are 24 hours in a day. I don't see the problem there.
Your code assigns a decimal value to a DATETIME field and uses an implicit conversion from decimal to datetime. Later on you CAST a DATETIME field to DECIMAL. If you change your DATETIME field to a DATETIMEOFFSET field you get the following errors: "Operand type clash: decimal is incompatible with datetimeoffset" for the assignment and "Explicit conversion from data type datetimeoffset to decimal is not allowed." for the explicit CAST. You don't see a problem with that?
Shifting gears a bit, I've not seen a lick of code on this problem from you. Rather than continuing to flip zingers, would you provide some demonstrable code to make your points so that we can actually test them? Thanks.
My point is that the ability to do decimal arithmetic on DATETIME and SMALLDATETIME data is an artifact of how those data types are implemented and not reflective of a meaningful operation on date/time data. Your point is that decimal arithmetic is more efficient than date/time arithmetic. I see no value in providing code to help you make your point when I have made my point without providing code.
Drew
[edited: changed "explicit" to "implicit"]
Heh... of course I see a problem with that. It's the same reason why I avoid the other "newer" types of temporal datatypes. If you convert the inconvertible to datetime and then do what you need, there will be no problem.
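For example, something along these lines (a sketch only; I'm assuming DATETIMEOFFSET columns named StartDTO and EndDTO that aren't in the original table):
--===== Sketch: switch both DATETIMEOFFSET columns to a common +00:00 offset,
--      convert to DATETIME, and the direct math then works exactly as before.
SELECT SUM( CAST(CAST(SWITCHOFFSET(EndDTO ,'+00:00') AS DATETIME) AS DECIMAL(28,17))
           -CAST(CAST(SWITCHOFFSET(StartDTO,'+00:00') AS DATETIME) AS DECIMAL(28,17)))*24
FROM #TimeCardHistory;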
Now... I've asked you several times... please post some code that would solve the problem at the millisecond level using the datatypes that you've been speaking of. And, yes, it can be done.
--Jeff Moden
Change is inevitable... Change for the better is not.