May 13, 2010 at 9:53 am
Hello,
I have a table that looks like this.
projectId (char(15)) | departmentId (char(7)) | month (int) | year (int) | ID (int) | Plan (int) | ETC (int)
'CA D10'             | '171'                  | 3           | 2009       | 1        | ?          | ?
'CA D10'             | '171'                  | 4           | 2009       | 2        | ?          | ?
'CA D10'             | '311'                  | 3           | 2009       | 3        | ?          | ?
'CA D10'             | '311'                  | 4           | 2009       | 4        | ?          | ?
'CA D10'             | '500'                  | 3           | 2009       | 5        | ?          | ?
'CA D10'             | '500'                  | 4           | 2009       | 6        | ?          | ?
I have to select departmentId, month, and year by projectid, which is easy, but I have to present it like this, which is not so easy, at least for me.
departmentId | month_year | month_year
'171'        | 3/2009     | 4/2009
'311'        | 3/2009     | 4/2009
'500'        | 3/2009     | 4/2009
There could be an indefinite number of month/year columns, because each one represents a month of the project, and a project could last two months, 12 months, or 60 months.
To add to this, I have to keep track of each row's ID somehow, so the user can update the row with a number for the Plan field, and later the ETC field.
I guess I have two questions. First, even if it is possible to manipulate the data to display correctly, is there a better way to structure the table? Basically, this table represents a project's "buckets". When a project is in the planning stages, one group adds Plan numbers, which represent the estimated hours in that month to complete the project. Later, during the project, a different group of people will enter ETC hours for each month in the project's span.
Thank you for your help.
May 13, 2010 at 10:29 am
It would be much easier if you provided table scripts and sample data. Can the project be in the table more than twice? If so, how should that be handled, and what would be your expected results? What have you tried so far?
For better, quicker answers, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
May 13, 2010 at 10:29 am
I think this does what you want, but I would probably think again about how the table is designed if you can (see the sketch after the query below).
DECLARE @Test TABLE
(
    ProjectID    CHAR(15),
    DepartmentID CHAR(7),
    [Month]      INT,
    [Year]       INT,
    ID           INT
);

INSERT INTO @Test (ProjectID, DepartmentID, [Month], [Year], ID)
SELECT 'CA D10','171',3,2009,1 UNION ALL
SELECT 'CA D10','171',4,2009,2 UNION ALL
SELECT 'CA D10','311',3,2009,3 UNION ALL
SELECT 'CA D10','311',4,2009,4 UNION ALL
SELECT 'CA D10','500',3,2009,5 UNION ALL
SELECT 'CA D10','500',4,2009,6;

WITH Test AS
(
    SELECT T.ProjectID,
           T.DepartmentID,
           CA.MonYr,
           -- Two identical row numbers: each PIVOT below consumes one of them
           ROW_NUMBER() OVER (PARTITION BY ProjectID, DepartmentID ORDER BY MonYr ASC) AS RN,
           ROW_NUMBER() OVER (PARTITION BY ProjectID, DepartmentID ORDER BY MonYr ASC) AS RN1,
           T.ID
      FROM @Test T
     -- CROSS APPLY computes the month/year string once so it can be referenced by name
     CROSS APPLY (SELECT RIGHT('00' + CAST([Month] AS VARCHAR(2)), 2) + '/' + CAST([Year] AS VARCHAR(4)) AS MonYr) AS CA
)
SELECT P1.ProjectID,
       P1.DepartmentID,
       MAX(P1.[1]) AS MonYrA,
       MAX(P1.[2]) AS MonYrB,
       MAX(P2.[1]) AS IDA,
       MAX(P2.[2]) AS IDB
  FROM Test AS T1
 PIVOT (MAX(MonYr) FOR RN IN ([1], [2])) AS P1
 INNER JOIN Test AS T2
 PIVOT (MAX(ID) FOR RN1 IN ([1], [2])) AS P2
    ON P1.ProjectID = P2.ProjectID
   AND P1.DepartmentID = P2.DepartmentID
 GROUP BY P1.ProjectID, P1.DepartmentID;
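On the design side, here's a sketch of one possible alternative shape, assuming SQL Server 2005+. The table, column, and constraint names here are just illustrative, not a prescription. Storing the bucket month as an actual date (the first of the month) instead of separate Month/Year ints makes sorting, pivoting, and date arithmetic simpler, and the UNIQUE constraint keeps accidental duplicate buckets out:

-- One possible redesign (a sketch, not a prescription)
CREATE TABLE dbo.ProjectBucket
(
    BucketID     INT IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key the UI can use for updates
    ProjectID    CHAR(15) NOT NULL,
    DepartmentID CHAR(7)  NOT NULL,
    BucketMonth  DATETIME NOT NULL, -- first day of the month; replaces the separate Month/Year ints
    PlanHours    INT NULL,          -- filled in by the planning group
    ETCHours     INT NULL,          -- filled in later by the project group
    CONSTRAINT UQ_ProjectBucket UNIQUE (ProjectID, DepartmentID, BucketMonth)
);

With that shape, the users update PlanHours and ETCHours by BucketID, and the month/year display string only exists in the presentation query.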
May 13, 2010 at 11:48 am
krenshau (5/13/2010)
Hello, I have a table that looks like this.
projectId (char(15)) | departmentId (char(7)) | month (int) | year (int) | ID (int) | Plan (int) | ETC (int)
'CA D10'             | '171'                  | 3           | 2009       | 1        | ?          | ?
'CA D10'             | '171'                  | 4           | 2009       | 2        | ?          | ?
'CA D10'             | '311'                  | 3           | 2009       | 3        | ?          | ?
'CA D10'             | '311'                  | 4           | 2009       | 4        | ?          | ?
'CA D10'             | '500'                  | 3           | 2009       | 5        | ?          | ?
'CA D10'             | '500'                  | 4           | 2009       | 6        | ?          | ?
I have to select departmentId, month, and year by projectid, which is easy, but I have to present it like this, which is not so easy, at least for me.
departmentId | month_year | month_year
'171'        | 3/2009     | 4/2009
'311'        | 3/2009     | 4/2009
'500'        | 3/2009     | 4/2009
There could be an indefinite number of month/year columns, because each one represents a month of the project, and a project could last two months, 12 months, or 60 months.
To add to this, I have to keep track of each row's ID somehow, so the user can update the row with a number for the Plan field, and later the ETC field.
I guess I have two questions. First, even if it is possible to manipulate the data to display correctly, is there a better way to structure the table? Basically, this table represents a project's "buckets". When a project is in the planning stages, one group adds Plan numbers, which represent the estimated hours in that month to complete the project. Later, during the project, a different group of people will enter ETC hours for each month in the project's span.
Thank you for your help.
Nice try on the posted data... for future posts and better/quicker answers, plsea see the article at the first link in my signature line below. Thanks.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 14, 2010 at 7:39 pm
Jeff, I have been thinking about whether or not to reply to your post. First, thank you for informing me of the rules. However, I find it interesting that you said that I could get faster and better answers. The two replies I received were done in about 30 minutes. I think that they were quite informative as well. Are you saying that if I had formatted my information according to the rules I could have received an answer in less than 30 minutes? Are you saying that the information provided was faulty or less than optimal? That sounds like an insult to the people who tried to help me. You, in fact, offered no solution, and you didn't follow the rules of grammar and spelling when you made your post. Which rules are more important, grammar and spelling or a forum's guidelines?
Don't get me wrong: despite the fact that I believe you will probably delete this post and ban me from the forums, I would follow the rules next time. I just don't suffer condescending and insulting authoritarians who are more concerned with "rules" than with helping a fellow human being, while not following other, more important rules at the same time.
I get that you have a lot of slackers who come on here, dump crap, and say "fix it for me"; I do. I don't think I did that. I would have thought a more courteous response from you would have been more appropriate.
Again, thank you for informing me of the rules in such an unhelpful way. If allowed, I will follow them next time.
May 14, 2010 at 10:45 pm
krenshau (5/14/2010)
Jeff, I have been thinking about whether or not to reply to your post. First, thank you for informing me of the rules. However, I find it interesting that you said that I could get faster and better answers. The two replies I received were done in about 30 minutes. I think that they were quite informative as well. Are you saying that if I had formatted my information according to the rules I could have received an answer in less than 30 minutes? Are you saying that the information provided was faulty or less than optimal? That sounds like an insult to the people who tried to help me. You, in fact, offered no solution, and you didn't follow the rules of grammar and spelling when you made your post. Which rules are more important, grammar and spelling or a forum's guidelines?
Don't get me wrong: despite the fact that I believe you will probably delete this post and ban me from the forums, I would follow the rules next time. I just don't suffer condescending and insulting authoritarians who are more concerned with "rules" than with helping a fellow human being, while not following other, more important rules at the same time.
I get that you have a lot of slackers who come on here, dump crap, and say "fix it for me"; I do. I don't think I did that. I would have thought a more courteous response from you would have been more appropriate.
Again, thank you for informing me of the rules in such an unhelpful way. If allowed, I will follow them next time.
While I'm not Jeff, and I won't speak for him, I will attempt to explain things.
First, you did have two posters answer you. The first asked several questions, because you didn't supply any table DDL and the sample data would have to be massaged to work. In fact, if you look at the second post, this is what Dohsan did. Now, how long did it take to do that? Yes, it is cut-and-paste, then play around with appropriate SQL commands to create the table DDL and the insert statements. Even this takes time... I've spent > 15 minutes getting some sample data posted like you did into a usable format... and many people who help out here will just skip right over questions where they can't just copy/paste code to set up the problem. But, if Dohsan had been able to just cut-and-paste the code that he ended up using, he would have come up with a faster answer. So yes, you would have received a faster answer. Notice also that even though Dohsan did provide the table DDL for his solution, this did not answer all of Mike01's questions. If you had provided the table DDL, complete with constraints, then these questions probably would not have been necessary, and Mike01 could have had an answer for you as well.
Second, unlike almost all other forums, SQLServerCentral.com does not delete posts, and I've never heard of them banning anybody.
Remember, we are all volunteers here. So, HELP US HELP YOU by following the advice in the article that both Jeff and Mike01 referenced (note also that it's the first link in my signature... and in many of the other regular volunteers' signatures here on SSC!)
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
May 15, 2010 at 10:11 am
krenshau (5/14/2010)
Jeff, I have been thinking about whether or not to reply to your post. First, thank you for informing me of the rules. However, I find it interesting that you said that I could get faster and better answers. The two replies I received were done in about 30 minutes. I think that they were quite informative as well. Are you saying that if I had formatted my information according to the rules I could have received an answer in less than 30 minutes? Are you saying that the information provided was faulty or less than optimal? That sounds like an insult to the people who tried to help me. You, in fact, offered no solution, and you didn't follow the rules of grammar and spelling when you made your post. Which rules are more important, grammar and spelling or a forum's guidelines?
Don't get me wrong: despite the fact that I believe you will probably delete this post and ban me from the forums, I would follow the rules next time. I just don't suffer condescending and insulting authoritarians who are more concerned with "rules" than with helping a fellow human being, while not following other, more important rules at the same time.
I get that you have a lot of slackers who come on here, dump crap, and say "fix it for me"; I do. I don't think I did that. I would have thought a more courteous response from you would have been more appropriate.
Again, thank you for informing me of the rules in such an unhelpful way. If allowed, I will follow them next time.
Oh jeez... another English major and spelling nit who has no recourse but to pick on quickly typed forum posts. Look... the first reply you got was a reply saying basically the same thing I said. The second reply you got was from a really nice guy who took pity on you and formatted your test data for an actual demonstration, because you're a newbie, or maybe just because he really is a nice guy. And if you really understood the crap that some "slackers" post, you wouldn't be busting my chops for trying to help a newbie like you do it right in the future. The reason I posted at all was to help you get better answers in the future, so lighten up. 😉
Which rules are more important, grammar and spelling or a forum's guidelines?
If you want proper grammar and all text to go through a spell-checker, go read a book on English. If you want good code, then get over yourself, read "the book" on how to post, and be thankful that someone took the time to point it out to you. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
May 16, 2010 at 1:05 pm
Please keep it professional. Asking for more information is acceptable, and encouraged. It's not necessarily for this post, but for future ones. Someone can knock out a solution in 5 minutes easily if you post the DDL and sample data.
May 18, 2010 at 8:41 am
I was not against you asking me for more information, or asking me to format it properly. I want to do everything I can to help others help me. It was the way it was stated. Even in the reply, statements like "...get over yourself..." are rude. It makes me not want to come back. I am not an English major, but I can be overly critical and insulting, just like all of Jeff's messages have been; that was my point.
May 18, 2010 at 8:48 am
Jeff was a little short after your reply, but how is this rude?
Nice try on the posted data... for future posts and better/quicker answers, plsea see the article at the first link in my signature line below. Thanks.
There's a typo in there, but I would think that wouldn't be a reason to respond with a crack about someone's English. There was nothing in there but a short comment pointing to information about how to better format things for help.
I think you are overreacting to something that isn't there. You asked a good question, you showed some thought, and there was nothing rude or wrong with it. It just wasn't set up well for someone to help you. Nothing wrong with that either; someone just noted that a few changes on your part would make it more likely for someone else to help.
May 18, 2010 at 2:15 pm
To me, the "nice try" seemed rude. Also, the "for quicker and better responses" part seemed condescending, because I thought the posts were quick and helpful. There is no reason to keep dragging this out, though. I think we both said what we felt, and that is good. I appreciate your responses and understanding. Perhaps he didn't mean it to be rude, perhaps it is his personality to be a little snippy, or maybe the intent was lost in the medium. In any case, we talked, or wrote, it out, and I think all that there is to say about it has been said. No hard feelings.
I will try to follow the guidelines next time, but I may have to ask for help in how to do some of it after having a quick glance at them.
Thank you.
May 18, 2010 at 4:14 pm
I actually did mean "Nice Try" as in "good job" and "almost right". It was meant to be a compliment because I've seen many posts that don't even come close to being a decent try. Poor choice of words on my part, apparently.
As for the rest of the stuff... it was all self defense after the first retort and I was as bewildered and ticked as the OP apparently was.
Sooooo.... what say we start over?
--Jeff Moden
Change is inevitable... Change for the better is not.
May 18, 2010 at 4:49 pm
Sooooo.... what say we start over?
That sounds good. I guess I was being too cynical. I apologize.
May 18, 2010 at 11:49 pm
Heh... ok... time to get down to some T-SQL. 😛
The following code builds some test data. Actually, it builds a lot of data, just to be sure we have a scalable solution. In this case, 100,000 rows works out as a pretty good test. As a sidebar, this is why we ask for readily consumable test data: it 1) shows the structure of the table, 2) saves the person trying to help time, and 3) usually makes it unnecessary to build our own test data, which allows us to concentrate more on the problem. Heh... and you defined a heck of a problem. Dynamic, concatenated, sorted, integer columns isn't the easiest thing in the world to come up with. 😀
Here's the code to build the 100k rows of data. Don't let that number scare you. The code takes less than 4 seconds to build and index on my 8-year-old desktop box...
--====================================================================================================================
-- Test setup. This just creates a test table and populates it with data. It isn't a part of the solution.
--====================================================================================================================
--===== Conditionally drop the test table to make reruns easier.
-- Note that this is NOT a part of the solution. It's just a test setup.
IF OBJECT_ID('TempDB..#TestTable','U') IS NOT NULL
DROP TABLE #TestTable
;
--===== Create 100,000 rows of data and create the test table on the fly.
-- Note that this is NOT a part of the solution. It's just a test setup.
-- This takes less than 4 seconds to build.
SELECT TOP 100000
CAST('CA 31' + LEFT(NEWID(),1) AS CHAR(15)) AS ProjectID,
CAST(ABS(CHECKSUM(NEWID())) % 100 AS CHAR(7)) AS DepartmentID,
CAST(ABS(CHECKSUM(NEWID())) % 12 + 1 AS INT) AS Month,
CAST(ABS(CHECKSUM(NEWID())) % 10 + 2000 AS INT) AS Year,
IDENTITY(INT,1,1) AS ID
INTO #TestTable
FROM Master.sys.All_Columns ac1
CROSS JOIN Master.sys.All_Columns ac2
;
--===== Add a clustered index to speed things along
CREATE CLUSTERED INDEX IX_#TestTable
ON #TestTable (ProjectID, DepartmentID)
;
--===== Display the top 1000 rows just to see what we have for test data.
-- Note that this is NOT a part of the solution. It's just a test verification.
SELECT TOP 1000 *
FROM #TestTable
;
Ok, so much for test data. Here's the code that solves 90% of all the problems you stated. It doesn't do the "ID" thing because I thought I'd run this part of the code by you to make sure it's what you wanted before throwing in the kitchen sink. 😉 Commented details are in the code...
--====================================================================================================================
-- This is the solution. Note that any variable beginning with @p could be a parameter in a stored procedure.
--====================================================================================================================
--===== Declare local variables
DECLARE @sql VARCHAR(MAX),
@pStartMonth DATETIME,
@pEndMonth DATETIME
;
--===== Here are the dates we want to check
SELECT @pStartMonth = 'JAN 2000',
@pEndMonth = 'DEC 2009'
;
--===== This builds the static portion of the SELECT list...
 SELECT @sql = 'SELECT ProjectID, DepartmentID,'
--===== ...this builds the dynamic portion of the SELECT list using XML concatenation.
     -- It actually builds the body of a CROSS TAB, which is a fair bit faster than PIVOT.
     -- Note: this is all one SELECT @sql = ... statement; a stray semicolon here would break it.
      + STUFF((
        SELECT ',' + CHAR(10) + 'MAX(CASE WHEN Month='
             + CAST(DATEPART(mm, DATEADD(mm, t.Number, @pStartMonth)) AS VARCHAR(2))
             + ' AND Year='
             + DATENAME(yy, DATEADD(mm, t.Number, @pStartMonth))
             + ' THEN STR(Month,2)+''/''+STR(Year,4) ELSE '''' END) AS Month_Year'
          FROM Master.dbo.spt_Values t
         WHERE t.Type = 'P'
           AND t.Number BETWEEN 0 AND DATEDIFF(mm, @pStartMonth, @pEndMonth)
           FOR XML PATH('')
        ),1,1,'')
--===== ...and this builds the static FROM, GROUP BY, and ORDER BY clauses.
     -- The leading space keeps the last column name and the FROM from running together.
      + ' FROM #TestTable GROUP BY ProjectID, DepartmentID ORDER BY ProjectID, DepartmentID'
;
--===== And, finally, we get our answer by executing the dynamic SQL.
     -- This takes less than 11 seconds to display in the grid mode of SSMS.
EXEC (@sql)
1,600 ProjectID/DepartmentID combinations spread across a very programmable (just change the start and end dates) 120 months (ten years) for 100,000 rows of highly randomized data, in 11 seconds, and it doesn't care whether there are accidental dupes or not. Let me know if that's kind of what you had in mind. 🙂
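In case it helps to see the technique concretely, here's roughly what @sql ends up containing for a short two-month range (a trimmed sketch of the generated CROSS TAB; the real string has one MAX(CASE...) line per month between @pStartMonth and @pEndMonth):

SELECT ProjectID, DepartmentID,
MAX(CASE WHEN Month=1 AND Year=2000 THEN STR(Month,2)+'/'+STR(Year,4) ELSE '' END) AS Month_Year,
MAX(CASE WHEN Month=2 AND Year=2000 THEN STR(Month,2)+'/'+STR(Year,4) ELSE '' END) AS Month_Year
 FROM #TestTable
 GROUP BY ProjectID, DepartmentID
 ORDER BY ProjectID, DepartmentID

Each month in the range gets its own MAX(CASE...) column, which is why the column list can grow to whatever the project span needs.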
--Jeff Moden
Change is inevitable... Change for the better is not.
May 19, 2010 at 12:46 am
Hmmmm... for some reason, the reply didn't "bump" on "Posts Added Today". I hope the OP got notified.
--Jeff Moden
Change is inevitable... Change for the better is not.