May 14, 2007 at 7:42 am
I need to write a report that is a summary per date range, from Monday to Sunday. Obviously there is a datetime field in the data. Any ideas on how to do this?
R
May 14, 2007 at 8:17 am
What kind of summary, totals per day? Totals per week? A list of records per day? An example table, its data, and the expected report output would be most helpful. Otherwise, look up the date functions in BOL, as they will probably be more helpful than my guessing at what you want.
James.
May 14, 2007 at 8:57 am
You'll want to set DATEFIRST to 1 (Monday) for this, and then you can use datepart(wk, <date>) to get the week of the year. Grouping on that (and on the year, if the data can span years) should get you what you want.
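A minimal sketch of that approach; the table and column names here are placeholders, not from the original post:

```sql
-- Make Monday the first day of the week; per the SQL Server docs,
-- the week and weekday dateparts depend on SET DATEFIRST.
set datefirst 1;

-- Count records per week of the year, grouping on year as well
-- so week numbers don't collide across years.
select
    datepart(yy, event_date) as yr,
    datepart(wk, event_date) as wk,
    count(*)                 as record_count
from dbo.events              -- hypothetical table and column
group by
    datepart(yy, event_date),
    datepart(wk, event_date)
order by yr, wk;
```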
May 17, 2007 at 6:11 am
For example I have the following data set:
20259 | Bleskop Shaft | CMA | 2007-01-03 13:33:12.100 |
20259 | Bleskop Shaft | LTI | 2007-02-01 17:16:18.527 |
20259 | Bleskop Shaft | LTI | 2007-01-11 06:33:45.703 |
20259 | Bleskop Shaft | LTI | 2007-01-16 15:13:06.280 |
20259 | Bleskop Shaft | MTC | 2007-01-17 11:42:55.833 |
20259 | Bleskop Shaft | MTC | 2007-01-03 07:39:24.243 |
20919 | Boschfontein Shaft | CMA | 2007-01-26 08:42:25.293 |
20919 | Boschfontein Shaft | LTI | 2007-01-26 09:44:15.673 |
20919 | Boschfontein Shaft | LTI | 2007-01-25 14:12:19.307 |
20919 | Boschfontein Shaft | LTI | 2007-01-29 16:59:09.997 |
20919 | Boschfontein Shaft | LTI | 2007-05-17 07:26:47.450 |
20919 | Boschfontein Shaft | LTI | 2007-01-03 07:32:57.887 |
20919 | Boschfontein Shaft | LTI | 2007-01-30 11:59:22.060 |
20919 | Boschfontein Shaft | LTI | 2007-01-08 08:25:16.190 |
20919 | Boschfontein Shaft | MTC | 2007-01-08 17:02:57.863 |
20919 | Boschfontein Shaft | MTC | 2007-01-12 07:39:19.193 |
20919 | Boschfontein Shaft | MTC | 2007-01-18 08:31:46.087 |
20919 | Boschfontein Shaft | MTC | 2007-01-26 06:49:21.477 |
20919 | Boschfontein Shaft | MTC | 2007-01-23 11:39:26.317 |
20260 | Brakspruit Shaft | LTI | 2007-01-02 16:07:19.720 |
20254 | Central Services | CMA | 2007-01-11 07:42:26.553 |
22064 | Central Services | S - INJ | 2007-01-17 10:26:40.513 |
20261 | Frank Shafts | CMA | 2007-01-10 14:36:27.660 |
20261 | Frank Shafts | LTI | 2007-01-09 13:58:42.403 |
20261 | Frank Shafts | LTI | 2007-01-09 14:17:50.707 |
20261 | Frank Shafts | LTI | 2007-01-05 11:53:51.700 |
20261 | Frank Shafts | LTI | 2007-01-02 15:14:38.177 |
20261 | Frank Shafts | LTI | 2007-01-30 11:49:57.253 |
20261 | Frank Shafts | LTI | 2007-02-01 06:57:11.467 |
20261 | Frank Shafts | LTI | 2007-01-16 16:16:09.397 |
I have to split the data up into two-weekly (fortnightly) periods and do a count of the records for each specific two-week period. The field is in datetime format.
Regards
R
May 17, 2007 at 6:36 am
Since you didn't give an example of the expected output, this is my best guess based on the information provided:
set datefirst 1 --1 = Monday, 2 = Tuesday, etc
select min(date_) as start_, max(date_) as stop_, count(*) as count_
from t1_
group by datepart(wk,date_)
order by min(date_)
/* -- This part just creates a test table and loads your data for testing with the above statement
set nocount on
if object_id('t1_','u') is not null drop table t1_
go
create table t1_ (Col1_ int, col2_ varchar(20), col3_ char(8), date_ datetime)
go
insert into t1_ values ('20259','Bleskop Shaft','CMA','2007-01-03 13:33:12.100')
insert into t1_ values ('20259','Bleskop Shaft','LTI','2007-02-01 17:16:18.527')
insert into t1_ values ('20259','Bleskop Shaft','LTI','2007-01-11 06:33:45.703')
insert into t1_ values ('20259','Bleskop Shaft','LTI','2007-01-16 15:13:06.280')
insert into t1_ values ('20259','Bleskop Shaft','MTC','2007-01-17 11:42:55.833')
insert into t1_ values ('20259','Bleskop Shaft','MTC','2007-01-03 07:39:24.243')
insert into t1_ values ('20919','Boschfontein Shaft','CMA','2007-01-26 08:42:25.293')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-01-26 09:44:15.673')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-01-25 14:12:19.307')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-01-29 16:59:09.997')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-05-17 07:26:47.450')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-01-03 07:32:57.887')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-01-30 11:59:22.060')
insert into t1_ values ('20919','Boschfontein Shaft','LTI','2007-01-08 08:25:16.190')
insert into t1_ values ('20919','Boschfontein Shaft','MTC','2007-01-08 17:02:57.863')
insert into t1_ values ('20919','Boschfontein Shaft','MTC','2007-01-12 07:39:19.193')
insert into t1_ values ('20919','Boschfontein Shaft','MTC','2007-01-18 08:31:46.087')
insert into t1_ values ('20919','Boschfontein Shaft','MTC','2007-01-26 06:49:21.477')
insert into t1_ values ('20919','Boschfontein Shaft','MTC','2007-01-23 11:39:26.317')
insert into t1_ values ('20260','Brakspruit Shaft','LTI','2007-01-02 16:07:19.720')
insert into t1_ values ('20254','Central Services','CMA','2007-01-11 07:42:26.553')
insert into t1_ values ('22064','Central Services','S - INJ','2007-01-17 10:26:40.513')
insert into t1_ values ('20261','Frank Shafts','CMA','2007-01-10 14:36:27.660')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-01-09 13:58:42.403')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-01-09 14:17:50.707')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-01-05 11:53:51.700')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-01-02 15:14:38.177')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-01-30 11:49:57.253')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-02-01 06:57:11.467')
insert into t1_ values ('20261','Frank Shafts','LTI','2007-01-16 16:16:09.397')
*/
May 17, 2007 at 7:19 am
Thanks A LOT! Worked like a charm...
May 17, 2007 at 6:09 pm
Simple, elegant, and nasty fast, James. Well done...
--Jeff Moden
Change is inevitable... Change for the better is not.
May 18, 2007 at 6:48 am
Thanks for the feedback, Jeff. I enjoy working out some of the T-SQL problems posted in the forum. In addition to keeping in practice, I get to see alternative solutions posted by others, often yourself, which are frequently better than my original attempt, and that helps me improve my skills.
James.
May 18, 2007 at 7:16 am
I share the same thoughts. You've certainly got the right idea about improving your own skills. Even though I've been doing this for a while, I still learn new things from this forum. It just can't be helped. For example... one guy recently posted a function with a loop in it. I normally use a Tally table (table of numbers) to resolve such things. Thought the guy was on crack for using a loop (didn't say that, of course), but it turned out that, for the particular problem at hand (hex to ASCII conversion), the WHILE loop solution was a bit faster than the Tally table solution. I'm still trying to figure out a way to beat his solution just because it's interesting.
Not that I should judge, but I wanted to pass along a compliment... most new folks don't jump in to try to wrangle out problems, and many of them have trouble coming up with sound solutions. Even when they do manage to eke out a solution, their code is pretty much crap in the readability department, or the solution is way out in left field. And, speaking of attitude, some will come in swinging a bat about how great they are, with handles that have "DBA" or "GURU" in them, bragging about how much time they supposedly have under their belt and, true or not, they either fail miserably or they turn everyone off with the constant beating of their own drum. Others are just plain rude about it or have a real chip on their shoulder.
I think our peers on this forum would agree with me in saying I've found NONE of that to be true in any of your posts and that you're a proverbial breath of fresh air. Thanks for being one of the good guys, James. It's a pleasure to "know" you.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 18, 2007 at 7:16 am
...see alternative solutions posted by others, often yourself, which are often better than my original solution...
Especially with Jeff's contributions
Far away is close at hand in the images of elsewhere.
Anon.
May 20, 2007 at 4:46 pm
Daaaang... thanks, David. Coming from the likes of you, that's quite the compliment.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 20, 2007 at 8:13 pm
Here is an alternate solution, using your test data. This one groups by the start of week date, using Monday as the start of the week.
This solution has a couple of advantages:
1. It is independent of the setting of DATEFIRST.
2. If you have data extending over multiple years, you do not have to worry about the week number repeating, because it groups by the start-of-week date.
select
-- Start of week date, week starting Monday
-- See this link for a Start of Week function or inline solution
-- http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=47307
dateadd(dd,(datediff(dd,-53690,date_)/7)*7,-53690) as Monday,
min(date_) as start_,
max(date_) as stop_,
count(*) as count_
from t1_
group by
-- Start of week date, week starting Monday
dateadd(dd,(datediff(dd,-53690,date_)/7)*7,-53690)
order by
dateadd(dd,(datediff(dd,-53690,date_)/7)*7,-53690)
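The original question asked for two-week periods, and the same day-arithmetic extends to that by dividing by 14 instead of 7. A sketch, assuming the fortnight boundary should fall on a Monday (-53690 is 1753-01-01, a Monday, in datetime day offsets; shift the anchor by 7 days if the fortnight parity comes out wrong for your reporting calendar):

```sql
-- Group into 14-day buckets anchored on a Monday, using the
-- same test table t1_ loaded above.
select
    dateadd(dd, (datediff(dd, -53690, date_) / 14) * 14, -53690) as period_start,
    min(date_) as start_,
    max(date_) as stop_,
    count(*)   as count_
from t1_
group by
    dateadd(dd, (datediff(dd, -53690, date_) / 14) * 14, -53690)
order by period_start;
```

Like the weekly version, this is independent of SET DATEFIRST, because the bucket is computed purely from a day count since a fixed Monday.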