April 4, 2012 at 5:09 am
Hi All,
I have a question which may seem naive, but if any of the gurus can answer it, it would clear up my understanding.
I have a query like the one below:
SELECT *,
       (SELECT COUNT(*)
        FROM TableA AS A1
        WHERE A1.col1 = A.col1)
FROM TableA AS A
INNER JOIN TableB AS B
    ON A.col1 = B.col1
This is just a prototype of the query I am working on. The A table is huge, so what I want to ask is: if I use a table multiple times in a query, will that table be accessed multiple times, or does the query engine find some way to tune the multiple accesses to the same table?
I ran the query with SET STATISTICS IO ON and I don't see multiple references to that table there. I also don't see them in the execution plan.
I would be thankful if you guys could spend some time on this.
Regards
Asit
April 4, 2012 at 5:38 am
While that query is simple, as you've illustrated, the optimizer is probably going to be able to rearrange the structure so that you're getting a good clean JOIN. But, as the complexity increases, I suspect you'll see performance go down. Instead of a correlated sub-query as you currently have, turn it into a derived table and JOIN against it. That will result in a more consistent execution plan, and possibly better performance.
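For the prototype above, the derived-table version might look something like the following sketch. The table and column names (TableA, TableB, col1) are taken from the original example and are assumptions about the real schema; the alias C and the column name cnt are made up for illustration:

```sql
SELECT A.*, B.*, C.cnt
FROM TableA AS A
INNER JOIN TableB AS B
    ON A.col1 = B.col1
INNER JOIN
    -- Aggregate the big table once, up front,
    -- instead of once per outer row.
    (SELECT col1, COUNT(*) AS cnt
     FROM TableA
     GROUP BY col1) AS C
    ON A.col1 = C.col1;
```

The point of the rewrite is that the count is computed one time per distinct col1 value and then joined, rather than being re-evaluated row by row.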
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 4, 2012 at 7:14 am
I understand what you are saying, but why am I not getting anything in the execution plan or STATISTICS IO? Do they aggregate the cost and then show it?
Again, my example is simple, but the real query has a number of counts using correlated sub-queries that join that huge table with other tables. In that case you suggest I should use a derived table?
Please advise, as I have to tune the code and that part is actually taking most of the I/O.
Regards
Asit
April 4, 2012 at 7:23 am
I don't understand the statement "not get anything in the execution plan." What are you expecting to get? The values in the execution plan regarding cost are only estimates based on statistics. They're not actual measures of performance.
As to the STATISTICS IO, those are a measure of the reads and scans. Looking at them will tell you how many reads are being done against the given tables. But I'm unsure how you're using them in that you're "not getting anything" again.
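A minimal way to capture those reads, using the prototype query from the original post (table names are the poster's assumptions):

```sql
SET STATISTICS IO ON;

SELECT *,
       (SELECT COUNT(*)
        FROM TableA AS A1
        WHERE A1.col1 = A.col1)
FROM TableA AS A
INNER JOIN TableB AS B
    ON A.col1 = B.col1;

SET STATISTICS IO OFF;
```

With that set, the Messages tab reports a line per table along the lines of "Table 'TableA'. Scan count ..., logical reads ...", which is where you would see how many reads each table is actually generating.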
Another option worth exploring is using a CROSS APPLY instead of a JOIN. Basically the same idea as the derived table, but instead applying it to the other tables on a per row basis. It's similar in logical processing to a correlated sub-query, but much more efficient in execution.
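Against the same hypothetical tables, the CROSS APPLY form might be sketched like this (again, TableA, TableB, col1, and the cnt/C names are assumptions carried over from the example):

```sql
SELECT A.*, B.*, C.cnt
FROM TableA AS A
INNER JOIN TableB AS B
    ON A.col1 = B.col1
CROSS APPLY
    -- Evaluated per outer row, like the correlated
    -- sub-query, but expressed as a table expression
    -- the optimizer can often handle more efficiently.
    (SELECT COUNT(*) AS cnt
     FROM TableA AS A1
     WHERE A1.col1 = A.col1) AS C;
```

Logically this is the same per-row correlation as the original sub-query; the difference is in how the optimizer can implement it.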
If your actual query involves lots and lots of correlated sub queries, yeah, I'm not surprised if you're seeing poor performance.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning