December 28, 2015 at 8:19 pm
Hi,
I need to set run-time (dynamic) values for Startdate and Enddate in the query below.
Enddate should be the current date and Startdate should be 7 days before Enddate.
Query:
SELECT * FROM tablename(nolock) where
startdate >= '2015-12-22'
and enddate <= '2015-12-28'
Can someone provide the complete query?
Thanks.
December 28, 2015 at 8:24 pm
here's a hint:
GETDATE() returns the current date and time.
December 28, 2015 at 8:38 pm
Is it correct?
SELECT * FROM tablename(nolock) where
startdate >= Getdate() -6
and enddate <= Getdate()
December 28, 2015 at 9:12 pm
NOLOCK is a no-no... don't do that unless you're aware of the nasty side effects.
December 28, 2015 at 9:54 pm
charipg (12/28/2015)
Is it correct?
SELECT * FROM tablename(nolock) where
startdate >= Getdate() -6
and enddate <= Getdate()
Maybe... it depends on what you define as "days ago". If you mean equal increments of exactly 24 hours per day back then, with the exception of the WITH(NOLOCK), yes. If you mean "days ago" as in calendar days that always start at midnight, then no. You'll need to convert GETDATE() to today's date with a midnight time before subtracting days. And if you really want a week, you should subtract 7 days unless you include today as a day, in which case subtracting 6 days is correct... well... maybe except when holidays or weekends or blue-moon-Tuesdays are involved. 😀
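For example, if you do want whole calendar days and startdate/enddate are DATETIME (a sketch, assuming SQL Server 2008 or later for the DATE data type), something along these lines would do it:
DECLARE @EndDate   DATETIME = CAST(GETDATE() AS DATE);      -- today at midnight
DECLARE @StartDate DATETIME = DATEADD(DAY, -7, @EndDate);   -- 7 calendar days before that
SELECT *
FROM dbo.tablename      -- and no NOLOCK 😉
WHERE startdate >= @StartDate
  AND enddate   <= @EndDate;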
--Jeff Moden
Change is inevitable... Change for the better is not.
December 28, 2015 at 10:00 pm
charipg (12/28/2015)
Is it correct?
SELECT * FROM tablename(nolock) where
startdate >= Getdate() -6
and enddate <= Getdate()
GETDATE() returns a datetime, on which you can do integer math. If your startdate and enddate columns are datetime data types, that'll work without an implicit cast. Also, remember that GETDATE() includes the time, so if you want to look at 6 days ago as of the exact time on that date, then you're good. If not, you'll have to take the start of today as the base instead of including the time.
Lynn Pettis has a post with common date functions at http://www.sqlservercentral.com/blogs/lynnpettis/2009/03/25/some-common-date-routines/. The DATEADD function is covered at https://msdn.microsoft.com/en-us/library/ms186819%28v=sql.110%29.aspx.
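For example (a sketch; the two forms return different things whenever the time of day matters, and the second assumes SQL Server 2008 or later for the DATE type):
SELECT GETDATE() - 6;                               -- integer math on DATETIME: exactly 6 x 24 hours ago, time of day included
SELECT DATEADD(DAY, -6, CAST(GETDATE() AS DATE));   -- midnight at the start of the calendar day 6 days ago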
December 29, 2015 at 12:22 am
charipg (12/28/2015)
Hi, I need to set run-time (dynamic) values for Startdate and Enddate in the query below.
Enddate should be the current date and Startdate should be 7 days before Enddate.
Query:
SELECT * FROM tablename(nolock) where
startdate >= '2015-12-22'
and enddate <= '2015-12-28'
Can someone provide the complete query?
Thanks.
This should get you started
😎
DECLARE @CURR_DATE DATE = CONVERT(DATE,GETDATE(),0);
DECLARE @D_MINUS_6 DATE = DATEADD(DAY,-6,@CURR_DATE);
/*
Pseudo query which brings back the instances where
startdate and enddate both fall within the period
*/
SELECT
    [COLUMN_NAME]
FROM [TABLE_NAME]
WHERE [START_DATE] >= @D_MINUS_6
    AND [END_DATE] <= @CURR_DATE;
/*
Pseudo query which brings back the instances where
startdate and enddate overlap the period
*/
SELECT
    [COLUMN_NAME]
FROM [TABLE_NAME]
WHERE [START_DATE] <= @CURR_DATE
    AND [END_DATE] >= @D_MINUS_6;
December 29, 2015 at 11:28 am
Ed Wagner (12/28/2015)
charipg (12/28/2015)
Is it correct?
SELECT * FROM tablename(nolock) where
startdate >= Getdate() -6
and enddate <= Getdate()
GETDATE() returns a datetime, on which you can do integer math.
Just because you CAN do something, doesn't mean that it's a good idea. DATE/TIME math should be done with the functions specifically designed for that: DATEDIFF and DATEADD.
The DATETIME and SMALLDATETIME data types do support decimal (not just integer) math, but it's probably only for backwards compatibility. None of the newer date/time data types support decimal math. I would not be surprised if at some point in the not-too-distant future, MS stopped supporting decimal math for the DATETIME and SMALLDATETIME data types.
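A quick sketch of the difference:
DECLARE @dt  DATETIME  = '2015-12-29';
DECLARE @dt2 DATETIME2 = '2015-12-29';
SELECT @dt + 0.5;                 -- legal on DATETIME: adds half a day (12 hours)
SELECT DATEADD(HOUR, 12, @dt2);   -- the DATEADD route works for every date/time type
-- SELECT @dt2 + 0.5;             -- fails: the newer types don't allow numeric addition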
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
December 29, 2015 at 12:53 pm
drew.allen (12/29/2015)
None of the newer date/time data types support decimal math.
I consider that to be a major failure on the part of Microsoft. I'll never understand why "improvements" are frequently accompanied by total loss of really useful simple-to-use features.
I would not be surprised if at some point in the not-too-distant future, MS stopped supporting decimal math for the DATETIME and SMALLDATETIME data types.
If they ever do, I'll wear my best postal uniform on the trip to Redmond.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 29, 2015 at 2:47 pm
Jeff Moden (12/29/2015)
drew.allen (12/29/2015)
None of the newer date/time data types support decimal math.
I consider that to be a major failure on the part of Microsoft. I'll never understand why "improvements" are frequently accompanied by total loss of really useful simple-to-use features.
I consider it a bug that decimal math works with DATETIME and SMALLDATETIME. The fact that you can perform decimal math on a DATETIME or SMALLDATETIME field is a side effect of how it is stored rather than being reflective of something meaningful for date/time data. In some future version of SQL Server, MS could decide to change how date/time data is stored, and decimal math would cease to work.
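You can see the storage leaking through (a quick illustration, nothing more):
SELECT CAST(CAST('2015-12-29 12:00' AS DATETIME) AS FLOAT);   -- days since 1900-01-01, with the time of day as the fractional part
SELECT CAST('2015-12-29 12:00' AS DATETIME) + 0.25;           -- the "decimal math" is just arithmetic on that underlying day count: this adds 6 hours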
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
December 29, 2015 at 8:40 pm
drew.allen (12/29/2015)
Jeff Moden (12/29/2015)
drew.allen (12/29/2015)
None of the newer date/time data types support decimal math.
I consider that to be a major failure on the part of Microsoft. I'll never understand why "improvements" are frequently accompanied by total loss of really useful simple-to-use features.
I consider it a bug that decimal math works with DATETIME and SMALLDATETIME. The fact that you can perform decimal math on a DATETIME or SMALLDATETIME field is a side effect of how it is stored rather than being reflective of something meaningful for date/time data. In some future version of SQL Server, MS could decide to change how date/time data is stored, and decimal math would cease to work.
Drew
I'm not surprised at that response. A whole lot of people feel the same way.
Let's have some fun. You use the non-decimal temporal math methods you speak of and I'll use my methods to solve the same problem, and then we'll compare. There are actually two quick little problems. Admittedly, this is a mostly unrealistic scenario because most time card machines don't record with a resolution of 1/300th of a second, but I figure that'll make it interesting.
1. Create a million row table of simulated time card information called "TimeCardHistory". Each row should be randomly generated. A column called "StartDT" must always have a value with a date and time with a resolution of at least 1/300th of a second rounded to the nearest millisecond with a value that can occur anytime during the years 2000 through 2015. A column called "EndDT" must also be randomly generated to have a date and time of at least 4 hours later and up to (including or not including... your choice) 18 hours later than the StartDT on the same row.
2. Using the newly formed "TimeCardHistory" table, calculate the sum of all simple durations created by the StartDT and the EndDT for the entire table and display the result as decimal hours.
And, no... I'm not being snarky here. I'd really like to see how others would do this using only the methods you speak of. I'll post mine right after you post yours.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 30, 2015 at 4:25 am
Thought I would have a go at your challenge, Jeff, just for fun as it is quiet at work.
Looking forward to any critiques - not sure if I went the correct way with randomising the dates.
I didn't randomise milliseconds.
Create Table #TimeCardHistory
(#TimeId int identity(1,1) not null
,StartDt datetime not null
,EndDt datetime not null
);
with cte as (
select 0 as Number union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0 union all
select 0),
cte1 as (
select 0 as number from cte c1
cross join cte c2
cross join cte c3
cross join cte c4
cross join cte c5
cross join cte c6)
insert into #TimeCardHistory(StartDt, EndDt)
select dateadd(day,-RandDay,dateadd(SECOND,-RandSecond,GETDATE())) as StartDt
,dateadd(day,-RandDay,dateadd(second,-RandSecond,dateadd(hour,iif(RandEndHour <4,4,RandEndHour),dateadd(minute, RandEndMinute,dateadd(second,RandEndSecond,GETDATE()))))) As EndDt
from cte1
cross join (select ABS(CHECKSUM(NEWID()) % 5475) as RandDay
,ABS(CHECKSUM(NEWID()) % 86400) as RandSecond
,ABS(CHECKSUM(NEWID()) % 18) as RandEndHour
,ABS(CHECKSUM(NEWID()) % 60) as RandEndMinute
,ABS(CHECKSUM(NEWID()) % 60) as RandEndSecond) Rnd;
select Sum(DATEDIFF(ms, StartDt, EndDt) / 3600000.0) as TotalHours
from #TimeCardHistory;
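For comparison, a decimal-math flavour of the same sum (just a sketch, leaning on the DATETIME columns above) would be something like:
select Sum(cast(EndDt - StartDt as float)) * 24 as TotalHours   -- DATETIME minus DATETIME gives the gap as days; cast to float and scale to hours
from #TimeCardHistory;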
December 30, 2015 at 9:08 am
Jeff Moden (12/29/2015)
And, no... I'm not being snarky here. I'd really like to see how others would do this using only the methods you speak of. I'll post mine right after you post yours.
Aren't you? If you try running DATEDIFF(ms, '2000-01-01', '2015-12-31 23:59:59.997') you'll get an overflow error, which means that it's impossible to create random data to your specifications using nothing but a single DATEADD() function. Changing the specifications to the nearest second would allow that approach, while still producing representative data.
Also, a comparison of the efficiency of the different approaches is irrelevant to my point that decimal math on dates is an artifact of how it is stored. We've already seen people posting that they're getting errors when they try to use decimal math on DATE fields, because they've been using it inappropriately on DATETIME fields.
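To be concrete (a rough sketch, not a full solution):
SELECT DATEDIFF(ms, '2000-01-01', '2015-12-31 23:59:59.997');   -- overflows: DATEDIFF returns INT, and 16 years' worth of milliseconds doesn't fit
-- at second granularity the count fits comfortably in an INT, so a single DATEADD can spread random values across the whole range
SELECT DATEADD(SECOND,
       ABS(CHECKSUM(NEWID())) % DATEDIFF(SECOND, '2000-01-01', '2016-01-01'),
       CAST('2000-01-01' AS DATETIME));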
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
December 30, 2015 at 12:51 pm
drew.allen (12/30/2015)
Jeff Moden (12/29/2015)
And, no... I'm not being snarky here. I'd really like to see how others would do this using only the methods you speak of. I'll post mine right after you post yours.
Aren't you? If you try running DATEDIFF(ms, '2000-01-01', '2015-12-31 23:59:59.997') you'll get an overflow error, which means that it's impossible to create random data to your specifications using nothing but a single DATEADD() function. Changing the specifications to the nearest second would allow that approach, while still producing representative data.
Also, a comparison of the efficiency of the different approaches is irrelevant to my point that decimal math on dates is an artifact of how it is stored. We've already seen people posting that they're getting errors when they try to use decimal math on DATE fields, because they've been using it inappropriately on DATETIME fields
Drew
Is that being snarky or just exhibiting the truth of the problem? Generating data to the ms across a large range of dates is very important to testing.
And based on your comment, SELECT must be horribly broken because people are getting errors with it. 😉 This forum is littered with such errors. Rather than just throwing out a comment, do you have an example of the errors that people are supposedly getting? Could it be that they just don't know what they're doing, like some of those people who get errors with SELECT?
--Jeff Moden
Change is inevitable... Change for the better is not.
December 30, 2015 at 3:11 pm
Jeff Moden (12/30/2015)
Generating data to the ms across a large range of dates is very important to testing.
I would say that depends on what you're testing and the granularity of the expected results. In this case, we're looking at timecard data. Here I expect the granularity to be no smaller than 1/4 hour, so having data to the nearest minute should be sufficient, but having data to the nearest second is not unreasonable. Requiring data to the nearest ms is completely unreasonable, especially since--as you already mentioned--most timecard systems don't even record data to the nearest ms.
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA