May 28, 2007 at 4:56 pm
Hi,
Came up with this code in 10 min:
-- Loop 1..100; GOTO jumps to the increment once a match has been printed
DECLARE @x int
SET @x = 1
WHILE @x <= 100
BEGIN
    IF (@x % 3) = 0 AND (@x % 5) = 0
    BEGIN
        SELECT @x, 'BizzBuzz'
        GOTO final
    END
    IF (@x % 3) = 0
    BEGIN
        SELECT @x, 'Bizz'
        GOTO final
    END
    IF (@x % 5) = 0
    BEGIN
        SELECT @x, 'Buzz'
        GOTO final
    END
    SELECT @x, ''    -- no match: print the number alone
final:
    SET @x = @x + 1
END
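For comparison, a minimal GOTO-free sketch of the same loop using IF ... ELSE IF (same output, one result set per number; assumes nothing beyond the T-SQL above):

-- Each number hits exactly one branch, so no label or GOTO is needed
DECLARE @x int
SET @x = 1
WHILE @x <= 100
BEGIN
    IF @x % 15 = 0            -- divisible by both 3 and 5
        SELECT @x, 'BizzBuzz'
    ELSE IF @x % 3 = 0
        SELECT @x, 'Bizz'
    ELSE IF @x % 5 = 0
        SELECT @x, 'Buzz'
    ELSE
        SELECT @x, ''
    SET @x = @x + 1
END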
What do you think?
May 28, 2007 at 5:03 pm
In fact, the worst programmer in our company is the only one who's got MS certification. Funny? Or not really?
No, not really funny ... our two worst SQL programmers (they were absolutely awful) also had MS certs ... they did know how to write loops out the wazoo, though.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 28, 2007 at 5:06 pm
Goto???
--Jeff Moden
Change is inevitable... Change for the better is not.
May 28, 2007 at 5:15 pm
May 28, 2007 at 5:42 pm
>... the total run time is 8.610 seconds (NOT a type-o... less than 9 seconds and a good 20 times (2000%) faster than the fastest loop code)
Jeff's response was thorough . . . and only contained one problem. A solution that runs in 1/20th of the time required for another solution is not 2000% faster. Technically, the slower one may be said to require 20 times as much time. If the faster solution were 100% faster, that would mean it took 100% less time than the other . . . in other words, it would have happened in 0 time. I cannot imagine what "2000% faster" means . . . other than, possibly, that it completed 19 times the slower one's time requirement before it started. 😉
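For what it's worth, the two readings can be reconciled with a worked example (the 172.2-second figure is illustrative, back-computed from the quoted 20x):

\[ \text{speedup} = \frac{t_{\text{slow}}}{t_{\text{fast}}} = \frac{172.2\,\text{s}}{8.61\,\text{s}} = 20 \]
\[ \text{percent faster (speed)} = \left(\frac{t_{\text{slow}}}{t_{\text{fast}}} - 1\right)\times 100\% = 1900\% \qquad \text{percent less time} = \left(1 - \frac{t_{\text{fast}}}{t_{\text{slow}}}\right)\times 100\% = 95\% \]

So if "faster" measures speed, 20 times the speed is 1900% faster; if it measures time saved, as this post reads it, the scale tops out at 100%.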
May 28, 2007 at 5:45 pm
I didn't completely dismiss WHILE loops... take a look at the end of my first post.
And, of course, no one would spend as much time during an interview as we have on this thread.
Not doing "premature optimization" (I just call it writing good code to start with) is what got the folks at the company I work for into so much trouble... everyone unit tested for a 100 or a 1000 rows and did nothing about anticipating the obvious growth to millions of rows not to mention the average of 640 deadlocks per day And, no, of course we're not talking about shaving a few microseconds... we're talking about hours of difference in the face of scalability. Since they wrote it all as RBAR, there's nothing they can do with hardware, indexes, or any other "quick fix" to speed up the code... they have to redesign/rewrite it... that's not effective, at all. And, since there's no embedded documentation in the code, the code has to be studied and analyzed to figure out what it's doing before they can even begin to write requirements for the code. Would have been much better to do a little "premature optimization" and a little documentation up front because it's costing a heck of a lot of money, now.
Admittedly, the BizzBuzz test is a bit of an insult to everyone's sensibilities... but if everyone in the company writes code 20 times slower than it could be, just to get the job done, then the customer is really going to be ticked when their data grows.
Good programmers expect growth against their code and know how to write good code for that as quickly or nearly as quickly as the bad programmers that write non-scalable, poorly performing, undocumented code. Bid it that way (good code) and then do it that way...
--Jeff Moden
Change is inevitable... Change for the better is not.
May 28, 2007 at 5:47 pm
Here's another version, and I sure hope it complies with Sergiy's rules so that he can put this code into production on his SQL Server 2000 servers!
select case when n % 15 = 0 then 'bizzbuzz' when n % 3 = 0 then 'bizz' when n % 5 = 0 then 'buzz' else convert(char(9), n) end m from ( select c.n + (d.n * 1000) + 1 from ( select top 1000 a.n + (b.n * 32) from ( select 0 union all select 1 union all select 2 union all select 3 union all select 4 union all select 5 union all select 6 union all select 7 union all select 8 union all select 9 union all select 10 union all select 11 union all select 12 union all select 13 union all select 14 union all select 15 union all select 16 union all select 17 union all select 18 union all select 19 union all select 20 union all select 21 union all select 22 union all select 23 union all select 24 union all select 25 union all select 26 union all select 27 union all select 28 union all select 29 union all select 30 union all select 31 ) a (n), ( select 0 union all select 1 union all select 2 union all select 3 union all select 4 union all select 5 union all select 6 union all select 7 union all select 8 union all select 9 union all select 10 union all select 11 union all select 12 union all select 13 union all select 14 union all select 15 union all select 16 union all select 17 union all select 18 union all select 19 union all select 20 union all select 21 union all select 22 union all select 23 union all select 24 union all select 25 union all select 26 union all select 27 union all select 28 union all select 29 union all select 30 union all select 31 ) b (n) order by 1 ) c (n), ( select top 1000 a.n + (b.n * 32) from ( select 0 union all select 1 union all select 2 union all select 3 union all select 4 union all select 5 union all select 6 union all select 7 union all select 8 union all select 9 union all select 10 union all select 11 union all select 12 union all select 13 union all select 14 union all select 15 union all select 16 union all select 17 union all select 18 union all select 19 union all select 20 union all select 21 union all select 22 union all select 23 union all select 24 union all select 25 union all select 26 union all select 27 union all select 28 union all select 29 union all select 30 union all select 31 ) a (n), ( select 0 union all select 1 union all select 2 union all select 3 union all select 4 union all select 5 union all select 6 union all select 7 union all select 8 union all select 9 union all select 10 union all select 11 union all select 12 union all select 13 union all select 14 union all select 15 union all select 16 union all select 17 union all select 18 union all select 19 union all select 20 union all select 21 union all select 22 union all select 23 union all select 24 union all select 25 union all select 26 union all select 27 union all select 28 union all select 29 union all select 30 union all select 31 ) b (n) order by 1 ) d (n) ) e (n) option (maxdop 1)
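For anyone squinting at that one-liner: the underlying trick is just cross-joining small derived tables to generate a sequence. A minimal SQL Server 2000-compatible sketch of the same idea, scoped down to 1 through 100 (not the million-row version above):

-- Build 0..99 from two cross-joined digit tables, then add 1
SELECT CASE WHEN n % 15 = 0 THEN 'BizzBuzz'
            WHEN n % 3  = 0 THEN 'Bizz'
            WHEN n % 5  = 0 THEN 'Buzz'
            ELSE CONVERT(varchar(3), n)
       END
FROM (SELECT ones.d + tens.d * 10 + 1
      FROM (SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4
            UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9) ones (d)
      CROSS JOIN
           (SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4
            UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9) tens (d)
     ) x (n)
ORDER BY n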
--
Adam Machanic
whoisactive
May 28, 2007 at 6:19 pm
Well, there's a pretty big difference between what I call "premature optimization" and what you call "good code". For example, I recently visited a client site where they'd added parameters to all of their stored procedures for control-of-flow branches, to shave a column or two off the result set on each call (if @x = 1 select a,b else if @x = 2 select a,b,c ... and so on). Maybe 20 or 30 rows were returned on each call to most of these procs, and the columns were generally integers or smallish strings, so we can assume about 100-200 bytes of bandwidth saved per call. Slightly faster? Sure. But does it really make a difference? Probably not in the vast majority of cases, and in this case absolutely not--they weren't even close to saturating their gigabit switch. The "optimization" was just done because they thought it would help, someday (thus "premature"). And both the maintainability cost and the chance of accidentally creating a bug skyrocket as a result.
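A hypothetical sketch of the pattern being described (procedure, table, and column names invented), next to the plain alternative:

-- The "premature optimization": one branch per caller, each trimming a column
CREATE PROCEDURE dbo.GetCustomers @mode int
AS
IF @mode = 1
    SELECT CustomerID, Name FROM dbo.Customers
ELSE IF @mode = 2
    SELECT CustomerID, Name, Region FROM dbo.Customers
-- ...every new column must now be added to every branch
GO

-- The plain version: one query; callers ignore columns they don't need
CREATE PROCEDURE dbo.GetCustomersAll
AS
SELECT CustomerID, Name, Region FROM dbo.Customers
GO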
So I just wanted to point out that a little bit faster is not necessarily always better. In some cases it might be a good idea to NOT go as far as you can with optimization. But I agree with you that generally speaking "better" code (written with care by someone who knows what they're doing) will tend to be faster by default...
--
Adam Machanic
whoisactive
May 28, 2007 at 6:43 pm
Heh... I agree... what you described those folks doing sounds absolutely insane even for concentrated "performance improvement" efforts. Sounds like someone spent a whole lot of time on the wrong things and, again, I agree... sounds like they made a place for bugs to creep in every time they need to touch the code.
And, I absolutely agree... just a little bit faster isn't usually worth taking a chance on introducing a new error into tried-and-true code. 20 times faster usually is. Your last statement is exactly what I'm talking (ok... ranting) about, and you hit the nail on the head...
--Jeff Moden
Change is inevitable... Change for the better is not.
May 28, 2007 at 6:55 pm
Different number of columns depending on a parameter value?
Well, I guess no optimisation could help that project survive until real amounts of data arrive.
It will forever be limited to the kinds of data sets Access, or even Excel, can handle.
_____________
Code for TallyGenerator
May 28, 2007 at 9:36 pm
You're right. From your comments, I had assumed you were using a table of your own creation. My mistake!!
However, your query would still fail the test as it only counts up to 99, not 100.
And yes, it definitely is a question loaded for the front end.
May 28, 2007 at 10:04 pm
If I modify my query to output 1,000,000 rows, it runs in 3 seconds and does not peg the single CPU of my laptop.
May 28, 2007 at 10:09 pm
I believe it was Jeff Moden who touched on code readability. As someone who has interviewed many database developers and even hired a few of them: if someone submitted the code as a single line, I would not consider them for a job, because I don't have the time to break their bad habits.
May 28, 2007 at 10:13 pm
I would discount any solution that resorted to the CLR for something that T-SQL can do. It's quicker now, but once the infamous SQL CLR memory leak kicks in, your whole system has to be restarted on a regular basis.
May 28, 2007 at 10:16 pm
What infamous memory leak are you referring to?
--
Adam Machanic
whoisactive