January 27, 2012 at 3:49 am
This is a good example of the hoops you sometimes have to jump through to carry out some conceptually quite simple tasks in basic T-SQL. For a much simpler approach to this problem (and many others) you might want to check out our T-SQL function libraries: http://www.totallysql.com/products/sqlrollingstats.
Charles Southey
www.totallysql.com
January 27, 2012 at 4:07 am
charles.southey (1/27/2012)
This is a good example of the hoops you sometimes have to jump through to carry out some conceptually quite simple tasks in basic T-SQL. For a much simpler approach to this problem (and many others) you might want to check out our T-SQL function libraries: http://www.totallysql.com/products/sqlrollingstats.
It's a shame you decided to release this as a commercial product (with ambitious pricing!), rather than as open source on a site like CodePlex. The choice to use SQLCLR UDTs is an interesting one; I have not found these to perform very well in general. It is concerning that you use shared state in a core UNSAFE assembly; it seems unlikely you are able to clean up successfully in all situations e.g. after a 'rude abort'. Nice idea in some ways, pity about the commercialism.
January 29, 2012 at 7:58 pm
Nice and enlightening ... on a par with your original tally table article.
Cheers,
Pete.
January 29, 2012 at 10:09 pm
PBaekdal (1/29/2012)
Nice and enlightening ... on a par with your original tally table article.
Cheers,
Pete.
Thanks Pete. I really appreciate the feedback.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 31, 2012 at 4:16 am
Great article, and an excellent technique I've been utilising (with care!) for a while.
I do, however, take issue with your statement that "The 'Quirky Update' will work just fine with all 2 part SET statements".
Try changing the UPDATE @OMG statement to the following, using only 2-part SET clauses, and see what you get 🙂
UPDATE @OMG
   SET @N = @N + 1,      -- first 2-part assignment to the variable
       @N = @N + 1,      -- second 2-part assignment to the SAME variable
       SomeInt = @N + 1  -- 2-part assignment to the column
  FROM @OMG
OPTION (MAXDOP 1)
Again we see that it works fine for the first row but then fails after that. As mentioned elsewhere, it is the variable being updated twice that matters, not whether the SET clauses are 2-part or 3-part (a contrasting sketch follows below).
Tim
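For contrast, here is a minimal sketch of the canonical form from the article (assuming the same @OMG table variable and counter variable @N); the variable is assigned exactly once per row, which bears out Tim's point that it is the double assignment, not the 2-part versus 3-part shape, that breaks things:

DECLARE @OMG TABLE (SomeInt INT NOT NULL);
INSERT INTO @OMG (SomeInt) VALUES (0), (0), (0);

DECLARE @N INT;
SET @N = 0;

UPDATE @OMG
   SET @N = SomeInt = @N + 1  -- 3-part clause: variable and column assigned together, once per row
  FROM @OMG
OPTION (MAXDOP 1);

SELECT SomeInt FROM @OMG;     -- yields a clean 1, 2, 3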
February 13, 2012 at 10:50 am
I just read this article about SQL Server 2012's new features (http://blog.tallan.com/2011/12/08/sql-server-2012-windowing-functions-part-1-of-2-running-and-sliding-aggregates/) and I'm thinking it *might* solve my problem, as long as I can join to other tables. I had to give up on the Quirky Update because I had to try to jam every column I needed into the same row and in some cases the row was too wide for SQL Server to handle.
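For anyone reading along later, here is a minimal sketch of the SQL Server 2012 running aggregate the linked post describes; the dbo.Transactions table and its columns are hypothetical:

SELECT AccountID,
       TranDate,
       Amount,
       SUM(Amount) OVER (PARTITION BY AccountID
                         ORDER BY TranDate
                         ROWS UNBOUNDED PRECEDING) AS RunningTotal  -- running sum per account
  FROM dbo.Transactions;

Being an ordinary SELECT, it can be joined to other tables like any other query.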
February 13, 2012 at 11:05 am
gcresse (2/13/2012)
I just read this article about SQL Server 2012's new features (http://blog.tallan.com/2011/12/08/sql-server-2012-windowing-functions-part-1-of-2-running-and-sliding-aggregates/) and I'm thinking it *might* solve my problem, as long as I can join to other tables. I had to give up on the Quirky Update because I had to try to jam every column I needed into the same row and in some cases the row was too wide for SQL Server to handle.
whilst you wait for 2012...could you consider creating a temp table from all of your subqueries/joins etc...and then running the QU on the temp table? (a sketch follows below)
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
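A minimal sketch of that two-step approach, with all object names hypothetical: materialize the subqueries/joins into a temp table clustered in running-total order, then run the QU under the article's safety rules (clustered index, TABLOCKX, MAXDOP 1):

-- Step 1: flatten the joins/subqueries into a temp table
SELECT t.TranID,
       t.Amount,
       CAST(0 AS BIGINT) AS RunningTotal
  INTO #Work
  FROM dbo.Transactions t
  JOIN dbo.Accounts a
    ON a.AccountID = t.AccountID;

-- Step 2: the clustered index defines the update order
CREATE UNIQUE CLUSTERED INDEX IX_Work ON #Work (TranID);

-- Step 3: the quirky update itself
DECLARE @Total BIGINT;
SET @Total = 0;

UPDATE w
   SET @Total = RunningTotal = @Total + Amount
  FROM #Work w WITH (TABLOCKX)
OPTION (MAXDOP 1);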
February 13, 2012 at 12:44 pm
The temp table would not be wide enough. I need the ability to calculate running percents on each component in the final product, and often there are thousands of them.
February 13, 2012 at 12:53 pm
gcresse (2/13/2012)
The temp table would not be wide enough. I need the ability to calculate running percents on each component in the final product, and often there are thousands of them.
ok... I can't see what you can see...but in your situation I would be asking myself "why have I got so many columns...should I be thinking about rows instead?"...I assume this is what you mean by "not wide enough"
anyway good luck.
kind regards
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
February 13, 2012 at 5:00 pm
J Livingston SQL (2/13/2012)
gcresse (2/13/2012)
The temp table would not be wide enough. I need the ability to calculate running percents on each component in the final product, and often there are thousands of them.
ok... I can't see what you can see...but in your situation I would be asking myself "why have I got so many columns...should I be thinking about rows instead?"...I assume this is what you mean by "not wide enough"
anyway good luck.
kind regards
I agree... think rows instead of columns here even if you don't use the QU (see the sketch after this post).
As a sidebar, any process that has thousands of steps should probably be reevaluated a bit. 😉
As another sidebar, can you move the discussion about such a wide table to a new thread, please? Thanks.
--Jeff Moden
Change is inevitable... Change for the better is not.
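To make the rows-instead-of-columns idea concrete, here is a minimal sketch (all names hypothetical) with one row per component and a QU running percent that resets whenever the product changes, following the article's grouped running-total pattern:

CREATE TABLE #Components
       (ProductID   INT          NOT NULL,
        ComponentID INT          NOT NULL,
        Pct         DECIMAL(9,4) NOT NULL,   -- this component's percent of its product
        RunningPct  DECIMAL(9,4) NULL);

CREATE UNIQUE CLUSTERED INDEX IX_Components
    ON #Components (ProductID, ComponentID);  -- defines the update order

-- (populate #Components from the source data here)

DECLARE @RunningPct  DECIMAL(9,4),
        @PrevProduct INT;
SELECT @RunningPct = 0, @PrevProduct = -1;

UPDATE tgt
   SET @RunningPct = RunningPct =
           CASE WHEN ProductID = @PrevProduct
                THEN @RunningPct + Pct
                ELSE Pct                      -- new product: restart the running percent
           END,
       @PrevProduct = ProductID               -- assigned after @RunningPct, so the CASE sees the prior row's value
  FROM #Components tgt WITH (TABLOCKX)
OPTION (MAXDOP 1);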
March 14, 2014 at 5:30 am
Jeff - on behalf of my client 😛
see http://www.sqlservercentral.com/Forums/Topic1550386-392-1.aspx
I took a stored procedure that had used cursors in Oracle, modified it (but not enough), and two nights ago it ran for 88 minutes.
Last night it ran in 4 minutes.
You are so right: RBAR is truly evil.
May 28, 2014 at 8:11 pm
Has anyone done any significant testing of the Quirky Update method in SQL Server 2008, 2012, or 2014? Does it work in those versions too?
It's easy enough to test out a few million rows myself, but I trust the gurus here a lot more than I trust myself to make any authoritative determination on this.
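For what it's worth, here is a minimal test sketch for 2012 and later (names hypothetical): populate a test table, run the QU, then count the rows where it disagrees with the documented windowed SUM:

-- 1,000,000 test rows
CREATE TABLE #Test
       (TranID       INT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
        Amount       INT NOT NULL,
        RunningTotal BIGINT NULL);

INSERT INTO #Test (Amount)
SELECT TOP (1000000) ABS(CHECKSUM(NEWID())) % 100
  FROM sys.all_columns ac1
 CROSS JOIN sys.all_columns ac2;

-- the quirky update under test
DECLARE @Total BIGINT;
SET @Total = 0;

UPDATE t
   SET @Total = RunningTotal = @Total + Amount
  FROM #Test t WITH (TABLOCKX)
OPTION (MAXDOP 1);

-- zero mismatches means the QU agreed with the windowed aggregate
SELECT COUNT(*) AS Mismatches
  FROM (SELECT RunningTotal,
               SUM(Amount) OVER (ORDER BY TranID
                                 ROWS UNBOUNDED PRECEDING) AS CheckTotal
          FROM #Test) v
 WHERE v.RunningTotal <> v.CheckTotal;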
May 28, 2014 at 9:50 pm
autoexcrement (5/28/2014)
Has anyone done any significant testing of the Quirky Update method in SQL Server 2008, 2012, or 2014? Does it work in those versions too?
It's easy enough to test out a few million rows myself, but I trust the gurus here a lot more than I trust myself to make any authoritative determination on this.
http://blog.waynesheffield.com/wayne/archive/2011/08/running-totals-in-denali-ctp3/
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
May 28, 2014 at 10:24 pm
Thanks, looks like QU is still a winner even in 2012. 🙂
September 26, 2014 at 5:47 pm
I first came across the Quirky Update just last week in one of Phil Factor's "Speed Phreak" challenges, where, in the lengthy commentary on the posted candidate solutions, someone compared it against the upgraded windowing functions in SQL Server 2012, and it's still faster. I wasn't looking for it in particular, but I saw "quirky update" and had to find out more!
It's hard to call this undocumented, with all the "evil" connotations people attach to it, when the syntax for this use of SET is right there in BOL (though not really explained in any depth). It's a vestige of the mother code from Sybase, where it has probably been a standard, known speed-up for quite some time...
(Yes, the syntax is still in BOL for the UPDATE statement in SQL Server 2014, but with no examples.)
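For reference, the grammar line in question is the SET @variable = column = expression form of the UPDATE statement. A minimal sketch of what it looks like in use (table and column names hypothetical):

DECLARE @RunningTotal INT;
SET @RunningTotal = 0;

UPDATE dbo.SomeTable
   SET @RunningTotal = SomeColumn = @RunningTotal + SomeColumn;
-- the expression's value is assigned to both the column and the variable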