July 28, 2020 at 4:28 pm
In a specific query, how can you easily identify how long each piece of code is taking? My query takes over 20 minutes to run, so I can't just comment out each line to see what helps; I need a better method.
July 28, 2020 at 6:12 pm
Look at the execution plan.
Also run it with:
SET STATISTICS IO, TIME ON
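A minimal sketch of how that looks in practice (the table name is a stand-in for one statement of the slow batch): wrap the batch so each statement's reads and CPU/elapsed times appear on the Messages tab in Management Studio.

```sql
SET STATISTICS IO, TIME ON;

-- Stand-in for one statement of the real 20-minute batch;
-- each statement gets its own IO and TIME block in Messages.
SELECT COUNT(*) FROM dbo.SomeLargeTable;

SET STATISTICS IO, TIME OFF;
```

Match the "SQL Server Execution Times" and logical-reads output in the Messages tab against each statement to see where the time and I/O go.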
July 28, 2020 at 7:43 pm
another option is to use the great free tool SentryOne Plan Explorer:
https://www.sentryone.com/plan-explorer
It gives you a lot more information than the standard execution plan in Management Studio, including actual I/O and actual CPU time, and it lets you drill into each individual operation and the possible indexes that could support it.
July 28, 2020 at 8:18 pm
The Actual Execution Plan can provide some hints, although it's not always "accurate" because of all the estimates it contains, even in the "Actual" plan.
For long-winded code, I always build in a "Messaging" parameter: if I set it to anything other than 0, the code spits out the duration (which is the ultimate indication of problems so far as users are concerned) and row count for each section. Of course, you have to modify the code to do that, but it'll help you find the real issue(s).
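A rough sketch of that "Messaging" parameter pattern, under my own assumptions (the procedure name, section contents, and table are all illustrative; only the timing/rowcount plumbing is the point):

```sql
CREATE OR ALTER PROCEDURE dbo.SomeProc
    @Messaging int = 0
AS
BEGIN
    DECLARE @StartTime datetime2(3) = SYSDATETIME(),
            @Rows int, @Ms int;

    -- Section 1 (stand-in for one expensive step of the real code)
    SELECT SomeColumn
      INTO #Work
      FROM dbo.SomeLargeTable;

    SELECT @Rows = @@ROWCOUNT,
           @Ms   = DATEDIFF(ms, @StartTime, SYSDATETIME());

    IF @Messaging <> 0
        RAISERROR('Section 1: %d rows, %d ms', 0, 1, @Rows, @Ms) WITH NOWAIT;

    SET @StartTime = SYSDATETIME();
    -- Section 2: repeat the same capture/report pattern for each section.
END;
```

RAISERROR ... WITH NOWAIT flushes each message as the section finishes, so you can watch the slow sections go by in real time rather than waiting for the whole run to end.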
--Jeff Moden
Change is inevitable... Change for the better is not.
July 29, 2020 at 11:31 am
I assume we're talking about a batch here with multiple statements?
In that case, put Extended Events to work. You can create a session using the sql_statement_completed event for batches, or sp_statement_completed for stored procedures. These will let you capture each individual statement in the batch so you can tell how long each one took to run.
Now, sp_statement_completed can easily be filtered by object_id; sql_statement_completed can't be filtered as easily. So be very cautious in how you use the session if you're looking at sql_statement_completed. Turn it on, run the query, turn it off. You'll still have a lot of data. One way to simplify the data collection and consumption is to enable causality tracking in the session; then all the statements will be grouped. I have a bunch of examples doing exactly this kind of work on my blog.
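To make the above concrete, here's a sketch of such a session. The session name, file name, and object_id value are all illustrative; swap in the object_id of the procedure under test (or drop that event if you're only after batch statements):

```sql
CREATE EVENT SESSION [StatementTiming] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.sql_text, sqlserver.session_id)),
ADD EVENT sqlserver.sp_statement_completed
    (ACTION (sqlserver.sql_text)
     WHERE (object_id = 1234567890))  -- illustrative object_id of the proc
ADD TARGET package0.event_file
    (SET filename = N'StatementTiming.xel')
WITH (TRACK_CAUSALITY = ON);  -- groups the statements, as suggested above
GO
ALTER EVENT SESSION [StatementTiming] ON SERVER STATE = START;
-- ... run the slow query here ...
ALTER EVENT SESSION [StatementTiming] ON SERVER STATE = STOP;
```

Each captured event carries a duration, so once the session is stopped you can sort the file's events by duration to find the offending statement.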
Then, once you identify the offending statement, as everyone else has said, look at the execution plan. If you need help on that, look at the link in my signature.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning