May 22, 2018 at 7:51 pm
In its simplest form: if I had a table that stored a person's name, the date the record was entered, and the amount of jobs done, then when I run the query in date order it should produce four columns, with the fourth indicating whether the amount of jobs went up, went down, or stayed static compared to the previous row. Example below:
NAME   DATE          AMOUNT of JOBS   DIRECTION
John   12/jan/2017   3                Static
John   13/jan/2017   4                Up
John   14/jan/2017   2                Down
John   15/jan/2017   4                Up
John   16/jan/2017   4                Static
John   17/jan/2017   1                Down
CREATE TABLE test ([Name] VARCHAR(50), [Date] smalldatetime,[AMOUNT of JOBS] Int )
GO
INSERT INTO leavecard(wtname,ein,startdate,endate)
VALUES('John','12/jan/2017','3')
VALUES('John','13/jan/2017','4')
VALUES('John','14/jan/2017','2')
VALUES('John','15/jan/2017','4')
VALUES('John','16/jan/2017','4')
VALUES('John','17/jan/2017','1')
May 22, 2018 at 9:34 pm
mick burden - Tuesday, May 22, 2018 7:51 PM (quoted above)
Are you allowed to write code that uses TempDB directly? Since you're using 2008, I'm thinking that a "Quirky Update" would be the fastest and lightest weight.
You might also want to change your readily-consumable data code above... it's just not going to work as written.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 23, 2018 at 12:34 am
Jeff Moden - Tuesday, May 22, 2018 9:34 PM (quoted above)
Oops, I don't know what happened there, Jeff, thanks for pointing it out. The revised code is below:
CREATE TABLE test ([Name] VARCHAR(50), [Date] smalldatetime,[AMOUNT of JOBS] Int )
GO
INSERT INTO test ([Name], [Date], [AMOUNT of JOBS])
VALUES ('John', '12/jan/2017', 3),
       ('John', '13/jan/2017', 4),
       ('John', '14/jan/2017', 2),
       ('John', '15/jan/2017', 4),
       ('John', '16/jan/2017', 4),
       ('John', '17/jan/2017', 1);
May 23, 2018 at 2:03 am
Performance could be an issue here and Jeff's suggestion of a Quirky Update would probably be the best.
However, here's one possibility:
WITH cte ([Name], [Date], [AMOUNT of JOBS], RowNum) AS (
    SELECT [Name], [Date], [AMOUNT of JOBS],
           ROW_NUMBER() OVER (PARTITION BY [Name] ORDER BY [Date] ASC)
      FROM test
)
SELECT cte.[Name], cte.[Date], cte.[AMOUNT of JOBS],
       CASE
           WHEN prev.[AMOUNT of JOBS] IS NULL THEN 'Static' -- first row for each name
           WHEN prev.[AMOUNT of JOBS] = cte.[AMOUNT of JOBS] THEN 'Static'
           WHEN prev.[AMOUNT of JOBS] < cte.[AMOUNT of JOBS] THEN 'Up'
           ELSE 'Down'
       END AS [DIRECTION]
  FROM cte
  LEFT JOIN cte prev
    ON prev.[Name] = cte.[Name]
   AND prev.RowNum = cte.RowNum - 1
 ORDER BY cte.[Name], cte.[Date];
Far away is close at hand in the images of elsewhere.
Anon.
May 23, 2018 at 2:29 am
Thanks David, that's a great help. I don't suppose you know what Jeff means about a quirky update?
May 23, 2018 at 3:22 am
mick burden - Wednesday, May 23, 2018 2:29 AM (quoted above)
Please see his article http://www.sqlservercentral.com/articles/T-SQL/68467/
Far away is close at hand in the images of elsewhere.
Anon.
May 23, 2018 at 6:38 am
mick burden - Wednesday, May 23, 2018 2:29 AM (quoted above)
David's method is the classic method for handling such things in pre-2012 instances and it works quite well, in most cases. The thing to be aware of is that when you reference the same CTE in the FROM clause more than once, the CTE will be executed for each occurrence found in the FROM clause just like a VIEW would (rumor has it that 2018 or 2019 might provide the option to "materialize" the results of a CTE so that such duplicate execution would no longer be necessary but we'll see).
As David suggests, on a huge amount of data, it can become a bit of a performance problem, especially because of the double execution of the CTE.
As you can tell by the fervor in the discussion associated with the article that David provided a link for (and my apologies for forgetting to post that link), a lot of people are aghast at the thought of using something that Microsoft won't support (and doesn't know how to use themselves). But, if you have a lot of data and the classic method that David was kind enough to post becomes a performance problem, then you might want to resort to the "Quirky Update".
If that's the case, post back and I'll write a demo for the consumable data you posted. The only reason I didn't do it last night was because I was operating on a shoestring of sleep and didn't want to take the time to fix the original data to use it.
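For anyone curious, here is a minimal sketch of the core "Quirky Update" pattern against the test table posted above. The #work temp table and index name are just illustrative, and this leaves out some of the safety checks the linked article covers, so read that first:

IF OBJECT_ID('tempdb..#work') IS NOT NULL DROP TABLE #work;

-- Copy the data into a temp table with an empty DIRECTION column.
SELECT [Name], [Date], [AMOUNT of JOBS],
       CAST(NULL AS VARCHAR(10)) AS [DIRECTION]
  INTO #work
  FROM test;

-- The clustered index must match the order the rows are processed in.
CREATE UNIQUE CLUSTERED INDEX IX_Work ON #work ([Name], [Date]);

DECLARE @PrevName   VARCHAR(50),
        @PrevAmount INT;

-- Carry each row's values forward into the next row through the variables.
UPDATE w
   SET [DIRECTION] = CASE
                         WHEN @PrevName IS NULL OR @PrevName <> [Name] THEN 'Static'
                         WHEN @PrevAmount = [AMOUNT of JOBS] THEN 'Static'
                         WHEN @PrevAmount < [AMOUNT of JOBS] THEN 'Up'
                         ELSE 'Down'
                     END,
       @PrevAmount = [AMOUNT of JOBS],
       @PrevName   = [Name]
  FROM #work w WITH (TABLOCKX)
OPTION (MAXDOP 1);

SELECT [Name], [Date], [AMOUNT of JOBS], [DIRECTION]
  FROM #work
 ORDER BY [Name], [Date];

The whole trick is that the UPDATE carries @PrevName and @PrevAmount from one row to the next in clustered index order, which is exactly the behavior Microsoft doesn't guarantee.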
--Jeff Moden
Change is inevitable... Change for the better is not.
May 23, 2018 at 7:24 am
Jeff Moden - Wednesday, May 23, 2018 6:38 AM: As David suggests, on a huge amount of data, it can become a bit of a performance problem, especially because of the double execution of the CTE.
Yes, it is very easy to forget this when using CTEs.
To overcome this (and other problems with CTEs) I sometimes use an indexed temp table, as sketched below.
But as you often state 'it depends' 😀
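A rough sketch of that indexed temp table approach, assuming the same test table as above (the #numbered temp table and index name are illustrative):

-- Materialize the numbered rows once, instead of re-running the CTE for each reference.
IF OBJECT_ID('tempdb..#numbered') IS NOT NULL DROP TABLE #numbered;

SELECT [Name], [Date], [AMOUNT of JOBS],
       ROW_NUMBER() OVER (PARTITION BY [Name] ORDER BY [Date]) AS RowNum
  INTO #numbered
  FROM test;

CREATE UNIQUE CLUSTERED INDEX IX_Numbered ON #numbered ([Name], RowNum);

SELECT cur.[Name], cur.[Date], cur.[AMOUNT of JOBS],
       CASE
           WHEN prev.[AMOUNT of JOBS] IS NULL THEN 'Static'
           WHEN prev.[AMOUNT of JOBS] = cur.[AMOUNT of JOBS] THEN 'Static'
           WHEN prev.[AMOUNT of JOBS] < cur.[AMOUNT of JOBS] THEN 'Up'
           ELSE 'Down'
       END AS [DIRECTION]
  FROM #numbered cur
  LEFT JOIN #numbered prev
    ON prev.[Name] = cur.[Name]
   AND prev.RowNum = cur.RowNum - 1
 ORDER BY cur.[Name], cur.[Date];

The numbering query runs once, so the self-join hits the indexed temp table twice instead of executing the CTE twice.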
Far away is close at hand in the images of elsewhere.
Anon.
May 23, 2018 at 7:47 am
David Burrows - Wednesday, May 23, 2018 7:24 AM (quoted above)
Totally agreed.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 23, 2018 at 9:50 am
Jeff Moden - Wednesday, May 23, 2018 6:38 AM (quoted above)
If that rumor is true it will make the T-SQL CTE more like the Oracle subquery factoring clause. That would actually be a really nice change.
May 30, 2018 at 9:22 am
Lynn Pettis - Wednesday, May 23, 2018 9:50 AM (quoted above)
Many thanks Lynn, I would appreciate a demo if that's OK with you?
Kind regards
Mick
May 30, 2018 at 1:23 pm
Another alternative is to get your version of SQL Server up to at least 2012, and then you can use LAG and will no longer need the CTE self-join.
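On 2012 or later, that would look something like this against the same test table (a sketch; PrevAmount is an illustrative alias):

-- SQL Server 2012+ only: LAG reads the previous row's value without a self-join.
SELECT [Name], [Date], [AMOUNT of JOBS],
       CASE
           WHEN PrevAmount IS NULL OR PrevAmount = [AMOUNT of JOBS] THEN 'Static'
           WHEN PrevAmount < [AMOUNT of JOBS] THEN 'Up'
           ELSE 'Down'
       END AS [DIRECTION]
  FROM (SELECT [Name], [Date], [AMOUNT of JOBS],
               LAG([AMOUNT of JOBS]) OVER (PARTITION BY [Name] ORDER BY [Date]) AS PrevAmount
          FROM test) src
 ORDER BY [Name], [Date];

The derived table computes LAG once so the CASE expression doesn't have to repeat the window function.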
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
May 30, 2018 at 2:45 pm
@Mick,
Are you all set or do you still have issues with this problem?
--Jeff Moden
Change is inevitable... Change for the better is not.
May 31, 2018 at 12:22 am
Hi Jeff, no issues as such. It's just nice to find various alternatives to the same solution; I find them a great way of learning. As for Steve's suggestion of upgrading to 2012 or above, that's not a possibility due to costs. I love this forum; even if I'm not posting questions, it's great to trawl through looking at other posts. It's a great place to learn.
May 31, 2018 at 9:38 am
mick burden - Thursday, May 31, 2018 12:22 AM (quoted above)
Cool. I just wanted to be absolutely sure that you were all set. And thanks for the feedback.
If performance does become an issue with the method you're using, post back and I'll show you how to safely use the "Quirky Update" method to blast through a million rows in just a couple of seconds.
--Jeff Moden
Change is inevitable... Change for the better is not.