July 23, 2008 at 6:56 am
Hi,
I've got a database with a lot of business logic spread over numerous stored procedures, each calling the others to manipulate data. They're also very old stored procs, so some of them run to 1,000 lines and contain lots of cursors.
What I'm after is a good way of analysing where my performance issues are.
If I run SQL Profiler with the default settings, it doesn't show which stored procedures are being called, only the initial call.
I've looked at using performance counters on the server, and they are helpful, but again they don't pinpoint where the problems are.
If I could get SQL Profiler to show that this stored proc was called, then that it called this stored proc, followed by this one, and so on, along with timings, that would be a great help.
Is this possible?
Kind regards
Andy
July 23, 2008 at 9:06 am
There is a way to do this, but you need to be very careful about how you use it. I'd suggest, if possible, only running it against a test system instead of your production system. In fact, the best mechanism would be to identify the long running wrapper procs first, then do what I'm about to suggest only on those wrapper procs and only on a test system.
You can simply capture statement completion. This shows the same thing as RPC:Completed or SQL:BatchCompleted, but on an individual, statement-by-statement basis. The reason you need to be careful where you run this is that it captures a lot of information, and on a production system this is one place where you might see Profiler seriously impact performance. Plus, on a production system, you might fill the Profiler buffer with statements such that it simply dumps the excess. That could make you miss the important stuff.
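A quick way to confirm the exact events involved, as a minimal sketch assuming SQL Server 2005 or later (where the sys.trace_events catalog view exists):

```sql
-- Hedged sketch, assuming SQL Server 2005 or later: list the trace events
-- whose names mention statement completion, to confirm the exact event names.
SELECT trace_event_id, name
FROM sys.trace_events
WHERE name LIKE '%StmtCompleted%';
-- SQL:StmtCompleted and SP:StmtCompleted should both appear here
```

In Profiler's event-selection dialog these live under the TSQL and Stored Procedures event categories respectively.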
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
July 23, 2008 at 9:41 am
Andy, performance issues on SQL Server can be tricky to track down. Starting with Profiler is probably the best option. Typically, when I do not know which procedure is causing the problem, I start by capturing a general trace including Duration, CPU, and Reads/Writes. I usually filter the capture to anything that takes more than 2 seconds to run (2,000 ms in the Duration field) or more than 5,000 reads, to minimize the potential performance impact of tracing.
This allows me to later group statements in Excel to see which ones generate the most reads, which take the longest, which are the most CPU-intensive, and which run most often. If you have a statement that takes 1 million reads but runs only once, you may want to look elsewhere, to see whether a smaller query that generates 10,000 reads is being called 1,000 times. This is the "macro" approach.
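The grouping doesn't have to happen in Excel; as a sketch of the same idea in T-SQL (the trace-file path below is a placeholder, and this assumes the trace was saved to a file readable via fn_trace_gettable):

```sql
-- Hedged sketch: aggregate a saved trace file to rank statements by reads.
-- The path is a placeholder; EventClass 10 = RPC:Completed,
-- 12 = SQL:BatchCompleted.
SELECT
    CAST(TextData AS NVARCHAR(4000)) AS stmt,
    COUNT(*)      AS executions,
    SUM(Reads)    AS total_reads,
    SUM(Duration) AS total_duration,
    SUM(CPU)      AS total_cpu
FROM fn_trace_gettable(N'C:\Traces\capture.trc', DEFAULT)
WHERE EventClass IN (10, 12)
GROUP BY CAST(TextData AS NVARCHAR(4000))
ORDER BY total_reads DESC;
```

Note that statements differing only in parameter values won't group together here, so for a quick look at the worst offenders this is a rough cut rather than a precise count.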
Then take your top 10 offenders and use Grant's suggestion to track what's happening within the stored procedure, one at a time. Again, you may need to group your statements to see what's going on in general, but at this point you may find some offenders and focus on those.
Herve Roggero
hroggero@pynlogic.com
MCDBA, MCSE, MCSD
SQL Server Database Proxy/Firewall and Auditing
July 23, 2008 at 9:51 am
Andy,
Good advice from both Grant and Herve. You should also know that you can use Profiler to create the trace, script it, and then run it on the server with it writing to a file. This eliminates the issue Grant mentions about filling the Profiler buffer and has a much smaller impact on performance. Once you have the trace created on the server, you can start and stop it with one command: sp_trace_setstatus [trace_id], [status] (0 = stop, 1 = start, 2 = close and delete).
In addition to what Herve mentions about filtering, you should also filter on the database as well, so that you are not getting any of the internal SQL Server processes.
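As a rough sketch of the scripted, server-side trace described above: the event and column IDs come from the sp_trace_setevent documentation, and the file path and database name are placeholders you'd replace with your own.

```sql
-- Hedged sketch of a scripted server-side trace, assuming SQL Server 2005
-- trace procedures; the file path and database name are placeholders.
DECLARE @TraceID INT, @maxsize BIGINT, @on BIT;
SET @maxsize = 50;  -- rollover size in MB
SET @on = 1;

-- Option 2 = TRACE_FILE_ROLLOVER; SQL Server appends .trc to the path
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\proc_stmts', @maxsize;

-- Event 45 = SP:StmtCompleted; columns 1 TextData, 13 Duration,
-- 16 Reads, 18 CPU, 34 ObjectName, 35 DatabaseName
EXEC sp_trace_setevent @TraceID, 45, 1,  @on;
EXEC sp_trace_setevent @TraceID, 45, 13, @on;
EXEC sp_trace_setevent @TraceID, 45, 16, @on;
EXEC sp_trace_setevent @TraceID, 45, 18, @on;
EXEC sp_trace_setevent @TraceID, 45, 34, @on;
EXEC sp_trace_setevent @TraceID, 45, 35, @on;

-- Keep only the database of interest (column 35, 0 = AND, 0 = equal)
EXEC sp_trace_setfilter @TraceID, 35, 0, 0, N'MyDatabase';

EXEC sp_trace_setstatus @TraceID, 1;   -- 1 = start
-- Later: EXEC sp_trace_setstatus @TraceID, 0;  -- stop
--        EXEC sp_trace_setstatus @TraceID, 2;  -- close and delete
```

Scripting an equivalent trace from Profiler (File > Export > Script Trace Definition) will produce something similar with the exact IDs filled in for you.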
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
July 23, 2008 at 10:22 am
Just a point of detail... You can overflow the buffer even when capturing to file. It just takes a heck of a lot more to do that, since writing out to a file is so much faster.
However, great advice. I didn't think of it.
July 23, 2008 at 1:28 pm
Thanks for the input. Unfortunately the Statement Completed event still doesn't help me.
The problem is that the trace only shows one entry (i.e. the original stored proc being called).
In actual fact, there's a request table full of request data. The stored proc opens a cursor on this table and, for each record, performs calculations, calls other stored procedures, and eventually populates a response table. I'm trying to figure out whether it's a cursor problem, a bottleneck in another stored proc, etc.
But Profiler is just showing one entry for the initial call, nothing else. Turning on Statement Completed didn't help either.
I was hoping to be able to see when a sub stored proc is called, and its response time.
July 23, 2008 at 1:37 pm
Statement completion should have buried you in information, especially if you're running cursors. So you went into TSQL and turned on SQL:StmtCompleted, and into Stored Procedures and enabled SP:StmtCompleted? And you didn't get all kinds of internal calls within stored procedures? Man, I sure do. I just rechecked it. On a small proc I've got nine individual statements from the internals.
July 23, 2008 at 2:04 pm
I just ran these procs I created to answer another forum question in AdventureWorks:
[code]
ALTER PROCEDURE temp_table_test_1
AS
    CREATE TABLE #test
    (
        test_id INT IDENTITY(1,1),
        test    NVARCHAR(50)
    );

    INSERT INTO #test (test)
    SELECT name
    FROM production.product;

    SELECT * FROM #test;

    EXEC temp_table_test_2;

    SELECT * FROM #test;
RETURN
GO

ALTER PROCEDURE temp_table_test_2
AS
    IF OBJECT_ID('tempdb..#test') IS NOT NULL
    BEGIN
        UPDATE #test
        SET test = 'Jack Corbett'
        WHERE test_id = 1;
    END
    ELSE
    BEGIN
        SELECT 'Temp table #test does not exist';
    END
RETURN
[/code]
Attached are the results of a Profiler session checking for SP:StmtCompleted and filtering on DatabaseName; it shows each SQL statement within each SP. I added DatabaseName to the output of the Standard Template and removed all events but SP:StmtCompleted.
July 23, 2008 at 8:25 pm
Grant Fritchey (7/23/2008)
Just a point of detail... You can overflow the buffer even when capturing to file. It just takes a heck of a lot more to do that since writing out to file is so much faster. However, great advice. I didn't think of it.
Grant,
I thought I had read this, but did not have the book handy earlier, so now I have finally had the chance to look it up (Inside SQL Server 2005: Query Tuning and Optimization, page 59).
That only applies to writing to a server-side file. Writing to a client-side file does not use the file provider; it uses the rowset provider (also from Inside SQL Server 2005: Query Tuning and Optimization, chapter 2).
July 24, 2008 at 4:38 am
Edit:
I think I've got it now. I was selecting the TSQL SQL:StmtCompleted event instead of the SP:StmtCompleted event.
Fantastic. I see what you mean about the output.
Thanks for your help everyone.
July 24, 2008 at 7:05 am
Interesting, Jack, thanks for the info. I was under the impression that it was possible to lose data regardless of the mechanism...
So, if I start the trace on the server in question through a T-SQL script and output to a networked file server, does that satisfy the criteria?
July 24, 2008 at 7:31 am
Grant,
Good question. I found the chapter of the book on MSDN on-line and here is the paragraph:
"The file provider is designed with a guarantee that no event data will be lost. To make this work even if an I/O slowdown or stall occurs, the internal buffers begin to fill if disk writes are not occurring quickly enough. Once the buffers fill up, threads sending event data to the trace begin waiting for buffer space to free up. In order to avoid threads waiting on trace buffers, it is imperative to ensure that tracing is done to a fast enough disk system. To monitor for these waits, watch the SQLTRACE_LOCK and IO_COMPLETION wait types."
So you could do it, but on a busy server you probably want to write to a local disk or a fast SAN.
Here is the link to the chapter on MSDN: http://msdn.microsoft.com/en-us/library/cc297236.aspx
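As a small, hedged sketch of monitoring the wait types that paragraph mentions (assuming the SQL Server 2005 DMVs are available):

```sql
-- Hedged sketch, assuming the SQL Server 2005 DMVs: check the two wait
-- types the quoted paragraph says to monitor while tracing to a file.
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type IN (N'SQLTRACE_LOCK', N'IO_COMPLETION');
```

If those counters climb noticeably while the trace is running, the trace destination is probably too slow for the event volume.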
July 24, 2008 at 8:02 am
Thanks for that. I love these forums. I learn something new every single day.
July 24, 2008 at 8:06 am
I'm with you on that. That's why I try to post regularly. So I can learn something.