October 9, 2007 at 3:52 am
Hi All,
I have two queries that return the same results, which is good.
So I have run some statistics to find out which one performs better:
QUERY ONE
CPU time = 569 ms
Elapsed time = 3562 ms
QUERY TWO
CPU time = 1927 ms
Elapsed time = 4112 ms
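For reference, the timings above are the kind of output you get from SET STATISTICS TIME; a minimal sketch, where the table and queries below are placeholders rather than the actual ones:

SET STATISTICS TIME ON;

-- QUERY ONE (placeholder)
SELECT col1, col2
FROM dbo.SomeTable
WHERE col1 = 42;

-- QUERY TWO (placeholder)
SELECT col1, col2
FROM dbo.SomeTable
WHERE col1 IN (SELECT col1 FROM dbo.SomeOtherTable);

SET STATISTICS TIME OFF;

-- The Messages tab then reports, per statement:
--   SQL Server Execution Times:
--      CPU time = ... ms,  elapsed time = ... ms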
However, when I look at the execution plan with both queries run in the same batch, it shows:
QUERY ONE takes 66 percent of the batch cost
QUERY ONE takes 34 percent of the batch cost
Which data should I use to determine which query is better?
Thanks
Chris
----------------------------------------------
Try to learn something about everything and everything about something. - Thomas Henry Huxley
:w00t:
Posting Best Practices
Numbers / Tally Tables
October 9, 2007 at 6:26 am
Generally the query which takes the higher percentage is the less efficient one. I rarely look at time when tuning; I treat IO as the most important measure. Once I've sorted the IO I may look at time, but generally if you reduce IO, time is reduced as well.
The GrumpyOldDBA
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
October 9, 2007 at 6:58 am
You should look at the number of scans and reads each one is doing. Time is nice and all, but for most queries, reducing scans or reads leads to reductions in time (not always; your mileage may vary, etc.).
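If it helps, a minimal sketch of capturing those numbers (the table name is a placeholder):

SET STATISTICS IO ON;

SELECT col1, col2
FROM dbo.SomeTable   -- placeholder
WHERE col1 = 42;

SET STATISTICS IO OFF;

-- The Messages tab then shows something like:
--   Table 'SomeTable'. Scan count 1, logical reads 12, physical reads 0, ...
-- Compare "Scan count" and "logical reads" between the two queries.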
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
October 9, 2007 at 7:51 am
I tend to agree with Colin and Grant, but which is which in the batch %? You have Query One twice.
October 9, 2007 at 8:16 am
If it's the "Estimated" Execution Plan... pay those percentages no mind... it "lies" because it frequently misses things. For example, only the first pass of a triangular join in a sub-query is considered, leaving out all the millions of internal rows that may actually be spawned.
If it's the "Actual" Execution Plan, it's pretty much faithful.
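For anyone who wants to check which plan they're looking at, a quick sketch (the query is just a placeholder):

-- Estimated plan: compiled only, nothing executes
SET SHOWPLAN_XML ON;
GO
SELECT col1 FROM dbo.SomeTable WHERE col1 = 42;   -- placeholder
GO
SET SHOWPLAN_XML OFF;
GO

-- Actual plan: the query runs and the plan carries real row counts
SET STATISTICS XML ON;
GO
SELECT col1 FROM dbo.SomeTable WHERE col1 = 42;   -- placeholder
GO
SET STATISTICS XML OFF;
GO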
Just to add another opinion, I always consider duration as part of performance tuning... it's usually a pretty good indication of the resources that will need to be consumed by the system, etc. Of course, you also have to consider what will happen to all of that when the scale of the tables grows... code that takes a minute today may take hours tomorrow when a table reaches a tipping point. But I've found that, in most cases, when comparing two pieces of known working code, the one with the shorter duration is also the more efficient in resource consumption.
Ok... time to duck... here come all the folks who will say "speed doesn't matter" 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
October 9, 2007 at 8:18 am
Sorry about that.
Query one takes 66%
Query two takes 34%
Thanks for the advice, everyone.
----------------------------------------------
Try to learn something about everything and everything about something. - Thomas Henry Huxley
:w00t:
Posting Best Practices
Numbers / Tally Tables
October 9, 2007 at 1:18 pm
It is useless to compare queries like this.
Duration may vary by a factor of 100 or more between 'hot' and 'cold' execution. A cold execution is one that runs after DBCC DROPCLEANBUFFERS or a server restart.
So provide all the values: CPU, Reads, Writes and Duration.
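For example, a "cold" timing run can be set up roughly like this on a test box (the query is a placeholder; don't drop the buffers on a production server):

CHECKPOINT;                -- write dirty pages to disk so they can be dropped
DBCC DROPCLEANBUFFERS;     -- empty the buffer pool: the next run reads from disk ("cold")

SET STATISTICS TIME ON;
SET STATISTICS IO ON;

SELECT col1, col2
FROM dbo.SomeTable         -- placeholder
WHERE col1 = 42;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;

-- Record CPU time, elapsed time, scan count, logical/physical reads and writes,
-- then repeat under identical conditions for the second query.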
October 9, 2007 at 10:30 pm
No... if you're talking about simple duration measurements, it's not useless... you've just gotta make sure they're either both "hot" or both "cold" 😛
But, in the face of blocking, looking at CPU time and the other measurements you suggested is much, much better.
--Jeff Moden
Change is inevitable... Change for the better is not.