November 13, 2007 at 8:57 am
Matt Miller (11/13/2007)
It's odd - because the CTE is giving me nothing but 1's in the ENGCount (not incrementing).
*Laughs* You might want to look at the query again, Matt. It's supposed to only return 1's for ENGCount. :hehe:
And I don't doubt you're not, since I've seen you use CTE's a lot, but we have others who are about to (use this to "gut" CTE's).
I hope not. I have seen some good uses of CTE's that outperform temp tables or table variables. To be quite honest, I've used all three (temp tables, table variables, and CTE's) in many situations. What I've found is that in some cases one is much better than the other two, but in other cases it's the opposite... and so forth.
I believe it just depends on what you're doing, and you need to test it with all three and just "see for yourself".
NOW, all of that being said, please don't ask for an example. I don't have one off the top of my head. When I come across one, I'll post it so we can discuss *cough cough* it then. 😀
______________________________________________________________________
Personal Motto: Why push the envelope when you can just open it?
If you follow the direction given HERE you'll likely increase the number and quality of responses you get to your question.
Jason L. Selburg
November 13, 2007 at 9:08 am
I understand, but the first query gives me an incrementing count (the inner join)...
but anyway, you're right - sounds like this is something we should be applying some "equine medicine" to....
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 13, 2007 at 10:41 am
Matt,
In the beginning of the post we made some modifications to Jason's script to meet the user's needs. We changed:
SELECT *
FROM @table t1 CROSS JOIN mynumbers t2
WHERE t2.N <= t1.EngCount
TO:
SELECT CustomerID, EngID, 1 AS [EngCount]
FROM @table t1 CROSS JOIN mynumbers t2
WHERE t2.N <= t1.EngCount
Jason posted the unedited code on page 2 for the comparison, and this is probably what threw you off.
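In case it helps, mynumbers is just a sequential numbers (Tally) table. A minimal sketch of how one could be built (the name and row count here are assumptions, not Jason's exact script):

SELECT TOP 11000
N = IDENTITY(INT,1,1)
INTO mynumbers
FROM Master.dbo.SysColumns sc1,
Master.dbo.SysColumns sc2

ALTER TABLE mynumbers
ADD CONSTRAINT PK_mynumbers_N PRIMARY KEY CLUSTERED (N)
-- the clustered key on N keeps the WHERE t2.N <= t1.EngCount range seeks cheap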
I hope this helps.
-Adam
November 13, 2007 at 10:51 am
Got it, Adam. That must be it. :)
By the way - if you were to rely on the execution plan, the CTE is "cheaper": the processor time is much less, but the disk/data access is so much bigger. The execution time of the tally table was always <1/3 of the exec time of the CTE in THIS case. It must be those repeated UNION ALL passes buried in the recursive CTE....
Funny how these metrics can be so hard to use.
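For reference, the recursive CTE side of the comparison is shaped something like this (a sketch reconstructed from context, not the exact code from page 2):

;WITH Nums (N) AS
(
SELECT 1
UNION ALL
SELECT N + 1 FROM Nums WHERE N < 2000
)
SELECT N
FROM Nums
OPTION (MAXRECURSION 2000)
-- each recursion level is effectively another pass, which is where the extra reads pile up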
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 13, 2007 at 5:38 pm
I've found is that in some cases one is much better than the other two
In what way? Not necessarily looking for an example, but how were they better?
A CTE is better than a derived table if you need to refer to the derived table more than once in a SELECT... but a CTE can only be used in the query immediately following it. Do you have that problem with temp tables? No... they can be used in (if need be) dozens of places in the same proc and in "downline" procs, as well.
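To illustrate that scope difference, here's a minimal sketch (the proc and table names are made up): a temp table created in an outer proc is visible to every proc it calls, while a CTE only lives for the single statement that follows it.

CREATE PROCEDURE dbo.InnerProc AS
BEGIN
-- resolves to the #Work table created by whichever proc called us
SELECT COUNT(*) AS RowsSeen FROM #Work
END
GO
CREATE PROCEDURE dbo.OuterProc AS
BEGIN
CREATE TABLE #Work (ID INT)
INSERT INTO #Work (ID) SELECT 1
INSERT INTO #Work (ID) SELECT 2
EXEC dbo.InnerProc -- the "downline" proc reads #Work directly
END
GO
EXEC dbo.OuterProc -- returns RowsSeen = 2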
And, so far, it looks like recursive CTE's are not to die for when it comes to performance... 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
November 13, 2007 at 5:58 pm
but we have others who are about to (use this to "gut" CTE's).
Like me! 😀
Heh... I don't need this thread to gut CTE's... most code examples that use them in other threads have already done that... I just haven't tested them. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
November 13, 2007 at 6:25 pm
Jeff Moden (11/13/2007)
I've found is that in some cases one is much better than the other two
In what way? Not necessarily looking for an example, but how were they better?
CTE is better than a derived table if you need to refer to the derived table more than once in a SELECT...
*laughs* there's one answer.
And, so far, it looks like recursive CTE's are not to die for when it comes to performance... 😉
I agree with this.
And all that I can say about examples is that I will make a conscious effort to post them, either as a new topic or a PM, when I find them. I only say this because I honestly can't think of one right now, but I KNOW I've come across this in the past. :P
______________________________________________________________________
Personal Motto: Why push the envelope when you can just open it?
If you follow the direction given HERE you'll likely increase the number and quality of responses you get to your question.
Jason L. Selburg
November 13, 2007 at 7:16 pm
Jeff Moden (11/13/2007)
but we have others who are about to (use this to "gut" CTE's).
Like me! 😀
Heh... I don't need this thread to gut CTE's... most code examples that use them in other threads have already done that... I just haven't tested them. 😉
Actually - did you see my last post? Execution-plan-wise and processor-effort-wise, the CTE seems to be the "better choice". It's undeniable, though, from what I can see, that time is NOT on the CTE's side (meaning it took longer to process, since apparently the disk reads are out of control). That's what's so f'ed up about these execution plan diagrams (so one-dimensional at times).
Still - in a processor-bound scenario, with good disk access (something I didn't have), I don't know.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 13, 2007 at 9:14 pm
OK, now it's my turn to throw a stick in the wheel. *maybe*
Wouldn't physical/logical reads and such depend on the platform, disk/file or memory structure you have set up for your SQL instance?
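Either way, the reads are easy enough to measure on whatever setup you have (a quick sketch using the standard SET options):

SET STATISTICS IO ON
-- run the tally-table version and the CTE version here, one at a time
SET STATISTICS IO OFF
-- the Messages tab then shows logical, physical, and read-ahead reads per table touched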
______________________________________________________________________
Personal Motto: Why push the envelope when you can just open it?
If you follow the direction given HERE you'll likely increase the number and quality of responses you get to your question.
Jason L. Selburg
November 13, 2007 at 9:40 pm
Actually - did you see my last post? Execution plan-wise, and processor effort-wise, CTE seems to be the "better choice".
Heh... they ALL say that until a job takes 8 or 10 hours to fail 😛
I've found that the estimated execution plan lies... according to both the estimated execution plan and the actual execution plan, which of the two snippets of code below will be faster?
-- Snippet 1: the RBAR WHILE loop
SET NOCOUNT ON
DECLARE @Counter INTEGER
SET @Counter = 1
WHILE @Counter <= 2000
BEGIN
SELECT @Counter
SET @Counter = @Counter+1
END
-- Snippet 2: the set-based SELECT/INTO
IF OBJECT_ID('tempdb..#MyCount') IS NOT NULL
DROP TABLE #MyCount
SELECT TOP 2000
Counter = IDENTITY(INT,1,1)
INTO #MyCount
FROM Master.dbo.SysColumns sc1,
Master.dbo.SysColumns sc2
SELECT * FROM #MyCount
Then, try each snippet... the first snippet will also run resources low on a 1GB box...
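If you want numbers instead of plan guesses, a simple duration harness will do (a sketch; GETDATE()'s resolution is coarse, but plenty for a 3x difference):

DECLARE @Start DATETIME
SET @Start = GETDATE()
-- paste one snippet here
SELECT DATEDIFF(ms, @Start, GETDATE()) AS ElapsedMS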
--Jeff Moden
Change is inevitable... Change for the better is not.
November 13, 2007 at 9:44 pm
Now you're just being mean, Jeff ... LOL 😀
Correct me if I'm wrong, but isn't your first part nothing more than RBAR and the second true SET Based?
😛
[EDIT/UPDATE] --> I'm not too sure how SQL handles WHILE loops compared to true set-based operations
______________________________________________________________________
Personal Motto: Why push the envelope when you can just open it?
If you follow the direction given HERE you'll likely increase the number and quality of responses you get to your question.
Jason L. Selburg
November 13, 2007 at 10:24 pm
Absolutely! And to be fair, it IS a really extreme example... but put both pieces of code in a single QA window and run an estimated execution plan on the whole thing... you'll see how much of a lie the execution plan can tell, especially if you look at those dumb "% of batch" numbers that so many well-meaning folks mistakenly believe in...
The key is, any time you see pencil-thin lines in the execution plan, you might be in trouble performance-wise... Not always, but it's worth checking out.
I look at the execution plans (especially the actual execution plan in test) mainly to see how many internal rows I'm generating... put your cursor on the connecting arrows to see the number of rows (the estimated plan lies real bad here, too). Also, the fatter the arrow, the more rows there are... kinda in a logarithmic fashion, too!
A really good example of a bold-faced lie when it comes to row counts in the estimated execution plan is where you have a correlated subquery with a triangular join, like so many folks use to make running totals and the like. It only shows the number of rows on the first "internal" pass... not the 50 million rows you might actually be generating during the whole run.
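For anyone who hasn't seen one, that running-total pattern looks roughly like this (the table and column names are invented for illustration):

SELECT t1.AccountID, t1.TranID, t1.Amount,
(SELECT SUM(t2.Amount) -- re-reads every prior row for EVERY outer row
FROM dbo.Trans t2
WHERE t2.AccountID = t1.AccountID
AND t2.TranID <= t1.TranID) AS RunningTotal
FROM dbo.Trans t1
-- with N rows per account, that's roughly N*(N+1)/2 internal rows... the "triangle"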
As a side bar... I've found that properly written code usually does run with the shortest duration... think about it... you wanna rattle the CPU and hard disk for an hour or a minute? And the code isn't "sleeping" during the hour long run to do something the same as the minute long run... it's using resources in one form or another... I get a fair amount of resistance from folks on that point (usually the old "Speed doesn't matter, only accuracy does") but I've proven it to myself more than once. Of course, accuracy is paramount, but performance runs a very close second in my book. And what do you think is going to have a better chance at being blocked or deadlocked... code that runs for an hour or code that runs for a minute?
Besides... I'd rather go home and have a cold one instead of staying at work babysitting lame-o code that might fail 12 hours from now 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
November 13, 2007 at 10:32 pm
Well, at least I can go to sleep knowing I wasn't losing my mind, and honestly I think I need to do that now 🙂 ... Sleep, not lose my mind.
'Night all...
______________________________________________________________________
Personal Motto: Why push the envelope when you can just open it?
If you follow the direction given HERE you'll likely increase the number and quality of responses you get to your question.
Jason L. Selburg
November 14, 2007 at 2:44 pm
Jeff Moden (11/13/2007)
Heh... they ALL say that until a job takes 8 or 10 hours to fail 😛
I've found that the estimated execution plan lies... according to both the estimated execution plan and the actual execution plan, which snippet of code below will be the fastest?
Then, try each snippet... first snippet will also run resources low on a 1GB box...
Absolutely! It's totally annoying. I also love how a lot of system stored procs, XP's, and DBCC calls don't even make "the list" when comparing branches....
It would be one thing if I had 200 independent RAID chains attached, with the files spread across them and all of the other things required, so that my disk access is not an issue or carries no cost... but that, more than anything, seems to be the issue, and it usually bears a great deal of "cost" in a process.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 14, 2007 at 8:03 pm
Nah, SAN is overrated 😉
Just put in 128 gig RAM and make a ramdrive for 32 gig.
Then put tempdb on the ramdrive, and hopefully the rest of the database can reside in RAM too.
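Relocating tempdb is just a file move plus a service restart - a sketch, with R: standing in for the ramdrive:

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'R:\tempdb.mdf')
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'R:\templog.ldf')
-- the new locations take effect the next time the SQL Server service starts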
//Peso
N 56°04'39.16"
E 12°55'05.25"