October 3, 2009 at 6:16 am
Jeff Moden (10/2/2009)
Nope... not sure anymore. The machine at work favored Lynn's method. My machine at home favors the original method you posted above. Unfortunately, I've changed jobs and the machine "at work" is no longer available to me to double check.
The change I made was simply to change the ROW_NUMBER's OVER (ORDER BY ...) clause to ORDER BY (SELECT NULL) rather than object_id. It eliminates a full sort, which helps a great deal, as you might imagine! The very fastest method is still the heavily modified constant-scan one (several CTEs and some UNION ALLs), but there's precious little in it.
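For anyone following along, here is a minimal sketch of the change being described (the TOP value and the use of sys.all_columns are just illustrative; the point is the ORDER BY clause):

```sql
-- Dynamic numbers generator over a system view.
-- ORDER BY (SELECT NULL) tells the optimizer we do not care about
-- the ordering, so no full sort is added to the plan.
SELECT TOP (10000)
       n = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM   master.sys.all_columns AS ac1
CROSS JOIN master.sys.all_columns AS ac2;
```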
Summary of my feelings on the subject:
1. A tally table can be useful. If you need one, create a real one. I've yet to hear a good reason not to have one in model.
2. If you can't create a real one, regard the person telling you 'no' with suspicion. 🙂
3. A dynamic tally is probably only going to be appropriate for relatively small numbers of rows, say tens of thousands...?
4. You need a real tally table 😀
5. Lynn's method is cool.
6. The various refinements make very little difference for the sorts of sizes we should be talking about.
7. Did I mention that the best solution is a real table? :w00t:
Jeff Moden (10/2/2009)
Just one thing to be careful of, Paul. I did a billion-row test in the past using the same method as above, and it caused the log file to grow to 40 GB. I recommend a single cross join, which will give a bit more than 16 million rows quite nicely. For anything that requires more than the square of the row count in a given table, I do recommend one of the various permutations of Lynn's code shown in the article and the discussions. I used Itzik's base-2 code and it caused no log growth on the billion-row test.
Switching to bulk logged before doing a select into will solve any logging issues of course.
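A sketch of that combination (single cross join, minimally logged SELECT INTO) might look like the following; the database name Sandbox and the tally size are purely illustrative, and the ISNULL/CONVERT wrapper is just a common trick to make the column NOT NULL so the primary key can be added:

```sql
-- One-off build of a permanent tally table with minimal logging.
-- A single cross join of sys.all_columns yields well over 16 million
-- rows on most instances; TOP limits it to the size we actually want.
ALTER DATABASE Sandbox SET RECOVERY BULK_LOGGED;

SELECT TOP (1000000)
       N = ISNULL(CONVERT(int, ROW_NUMBER() OVER (ORDER BY (SELECT NULL))), 0)
INTO   dbo.Tally
FROM   master.sys.all_columns AS ac1
CROSS JOIN master.sys.all_columns AS ac2;

ALTER TABLE dbo.Tally ADD CONSTRAINT PK_Tally PRIMARY KEY CLUSTERED (N);

ALTER DATABASE Sandbox SET RECOVERY FULL;
```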
The thing that frustrates me about the system-table with row_number approach is the way the optimizer insists on adding row count spools to the plan. The constant scan solution has something like 8 cross joins, but performs very slightly better because row count spools are not added.
Producing a pure cross-join plan from system table inputs has eluded me thus far; this is a shame because I fancy it would be just as fast as the constant scan plan, but with much more compact code. It's all a bit academic really, for all the points I mentioned before, but nevertheless, it is frustrating.
You will be pleased to hear that my attempts to produce a faster CLR-based version failed miserably 🙂
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
October 3, 2009 at 10:28 am
Lynn Pettis (10/2/2009)
Cool. It was released in 1962 and ran initially on the PDP-1. It was used by the FEs when setting up the PDP-1, as a final test before turning the systems over to the customer (if you can believe Wikipedia). Pong didn't come out until 1972.
The weird thing is that what I saw and played shouldn't have been there. As best as I can reconstruct it, it was at a penny arcade on the boardwalk in Ocean City, MD, in the summer of either 1970 or 1971, when I was 12 or 13. It was 3x as expensive as the other games and not very popular, sitting mostly unnoticed in a corner, but I couldn't take my eyes off of it. I saw some older kids try to play it and walk away in disgust. I used all of my spending money (75 cents, I think) to play one game. The controls were exotic (two joysticks and some buttons) and it was over before I could really figure out what I was supposed to do (turns out it was a two-player game). All in all a pretty disappointing experience, but it was all I could think about for days.
For years afterward, I puzzled over what the heck it was, though by my senior year in high school (74-75) Pong was in practically every other corner store, so I figured it was just some early computer video game that I had never heard of. By 1980, I had seen Computronics(sp?) re-packaged Spacewar, and I knew that that was the game I had played, but not what the story behind it was.
In the 80's and 90's I took advantage of the Arpanet and the Usenet groups to find out more (Byte magazine was a big help too). But this is where it gets weird, because what I saw in 1970 shouldn't have existed, and certainly not where I saw it. Because according to the official history, up until 1972, there were no arcade units. Indeed officially, there were only two built up until that time, both Lab Computers at MIT, and still in that area. No arcade units would be built until supposedly 1972 as "Galaxy Game", and even then only a few units were ever built and it's not even clear if any of them ever made it into any arcades.
Still, I saw what I saw, and I did play that thing way back in 1970. And the features I remember obsessing over in high school (gravity, hyperdrive, the fuzzy gray and white fixed starfield, the square looping geometry) were way too specific to be either a coincidence or something that I confabulated later.
I only ever found two partial confirmations of what I saw, both from the Usenet groups in the late 80's and early 90's. One was a guy who said that he too remembered a video game almost exactly like Spacewar at an arcade on the eastern shore, at about the same time. The other was from someone who claimed to have some peripheral connection to the Spacewar developers/owners in Massachusetts, who said he thought there may have been some arcade units out there in the late 60's or early 70's, before Galaxy Game was released.
So my best guess is that some prototype units were made, either by the Galaxy Game folks for test marketing, or by the original Spacewar owners for promotional purposes (apparently they tried doggedly to sell or license it to an arcade manufacturer up until the early 80's), and then distributed to high-profile sites, like that arcade on the boardwalk in Ocean City.
I guess I'll never really know for sure though... :satisfied:
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
October 3, 2009 at 1:42 pm
Barry, for some reason the music from "Twilight Zone" keeps running through my head now.
__________________________________________________
Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills
October 3, 2009 at 5:19 pm
It were alien devices I tell you...all we got now is reverse engineered 😉
October 4, 2009 at 9:48 am
Jeff Moden (9/22/2009)
That's nice, but let's see you use that in a join or even as a stand-alone result set.
A table valued function will cover this if you don't want the variable exposed in the current scope.
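A sketch of what such a function might look like: an inline table-valued function wrapping a row-number generator, so the sequence can be used in joins without exposing a variable. The function name GetNums and the use of sys.all_columns here are illustrative assumptions, not code from the thread:

```sql
-- Inline TVF returning the numbers 1..@n, usable anywhere a table is.
CREATE FUNCTION dbo.GetNums (@n bigint)
RETURNS TABLE
AS
RETURN
    SELECT TOP (@n)
           n = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
    FROM   master.sys.all_columns AS ac1
    CROSS JOIN master.sys.all_columns AS ac2;
GO

-- Example: use it as a stand-alone result set or in a join.
SELECT d.n
FROM   dbo.GetNums(100) AS d;
```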
October 4, 2009 at 9:54 am
Clearly you did not understand the code. COUNT returns one row. My code returns the vector 0..N-1 based on the size of the target table.
the sqlist (9/22/2009)
laughingskeptic (9/22/2009)
One usually does not need a tally table larger than the largest table, in which case the following works as a generator:
declare @i bigint
set @i = 0
select @i=@i+1 as num from largest_table
What happened to the good old count:
declare @i bigint
select @i=count(*) from largest_table
October 4, 2009 at 1:04 pm
laughingskeptic (10/4/2009)
Clearly you did not understand the code. COUNT returns one row. My code returns the vector 0..N-1 based on the size of the target table.
I've got to say - I think it's you who did not understand the code. Firstly, it's a scalar variable; secondly, the other code does exactly the same, just minus the RBAR.
Of course, it would be far more efficient to query sysindexes for the largest value anyway, then add maybe 5% to that number to account for the inaccuracies inherent therein.
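Something along these lines, perhaps (a sketch only; rowcnt in the sysindexes compatibility view is maintained lazily, hence the 5% head-room suggested above):

```sql
-- Approximate row count of the largest table, from sysindexes.
-- indid 0 = heap, indid 1 = clustered index, so each table counts once.
SELECT TOP (1)
       approx_rows = CONVERT(bigint, rowcnt * 1.05)
FROM   sysindexes
WHERE  indid IN (0, 1)
ORDER  BY rowcnt DESC;
```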
Atlantis Interactive - SQL Server Tools
My blog
Why I wrote a sql query analyzer clone
October 4, 2009 at 4:14 pm
Matt Whitfield (10/4/2009)
laughingskeptic (10/4/2009)
Clearly you did not understand the code. COUNT returns one row. My code returns the vector 0..N-1 based on the size of the target table.
I've got to say - I think it's you who did not understand the code. Firstly, it's a scalar variable; secondly, the other code does exactly the same, just minus the RBAR.
Of course, it would be far more efficient to query sysindexes for the largest value anyway, then add maybe 5% to that number to account for the inaccuracies inherent therein.
Agreed 🙂
The posted code does not return a 'vector' (set?) at all; it just increments the value of @i once for every row. That was the basis for the sqlist's very reasonable question.
laughingskeptic, you need to be very sure that what you are posting is bullet-proof before making such bold statements. Otherwise, you risk coming off as an arrogant twit. :laugh:
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
October 4, 2009 at 4:16 pm
My first addiction was to the cartridge version (woot) of Galaxian on the VIC-20. Awesome game.
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
October 5, 2009 at 3:38 am
The first game I played was BMX Simulator for the C64 I think... Or maybe it was Way of the Exploding Fist... either way, I don't really have a 'claim to fame' on that front!
I did start programming when I was 6 though! 😀
Atlantis Interactive - SQL Server Tools
My blog
Why I wrote a sql query analyzer clone
October 5, 2009 at 5:42 am
laughingskeptic (10/4/2009)
Jeff Moden (9/22/2009)
That's nice, but let's see you use that in a join or even as a stand-alone result set.
A table valued function will cover this if you don't want the variable exposed in the current scope.
So, let's see the code to do that.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 5, 2009 at 7:47 am
Can't remember the first game I played. It would have been on a VIC-20 at a friend's house, because we couldn't afford computers.
My first introduction to anything resembling programming was at 7 or 8: a turtle program for the BBC Micro. We had it at school, and had the robotic turtle and everything. It was ace. Mum got a BBC shortly after, and I'd spend the evenings prepping scripts of complex patterns to take into school and draw with the robot.
_______________________________________________________
Change is inevitable... Except from a vending machine.
October 5, 2009 at 8:16 am
For me it was Hunt the Wumpus, hand-coded on my TRS-80 Model III circa 1979. I think there was another one I did, something that started with a 'Z' - Zork, Zon? BTW, that "computer" had 4K of RAM and CASSETTE TAPE DRIVE EXTERNAL STORAGE - and I was the baddest *** geek for a hundred miles (small-town NC at the time). Oh, did I mention that my parents forked over $999 for it?!?! Man have we come a damn long way!
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
October 5, 2009 at 8:24 am
Data Error! Please rewind and try again.
_______________________________________________________
Change is inevitable... Except from a vending machine.
October 5, 2009 at 8:36 am
I started programming in junior high. Not even on a "real" computer: an HP 9810 (a glorified programmable calculator) that my grandfather purchased for his medical lab. First it was just modifying existing code to not waste paper. Then I actually started writing some original code for him based on the algorithms he provided. He even provided test data and expected results. He was doing it right, and I wasn't the one teaching him; he taught me.
Edit: Here is some info about the HP 9810.