May 22, 2009 at 2:51 pm
Goldie Graber (5/22/2009)
For some reason when I use GetDate() I always get zero. When I use the view I get some value, usually under 100 ms.
I'm thinking that's because SQL Server dates are only accurate up to 1/300 of a second.
Yes, that's right. That is one reason why we like to do performance tests with a thousand or a million rows/iterations; it gets the measurements well above the accuracy/precision thresholds.
If your concern is accurately measuring the elapsed times of production procedure executions that are only going to be called one at a time, then if you collect enough instances you can usually do statistical averaging accurately. For instance, if procedure spProcA is called in production one million times and records an elapsed time of 0 ms 700,000 times and 3 ms 300,000 times, then we may conclude that it runs in about 0.9 ms with reasonable accuracy.
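To make that concrete, here is a minimal sketch of the loop-based approach (dbo.spProcA is a placeholder name; substitute your own procedure). The idea is to run enough iterations that the total duration sits well above GETDATE()'s roughly 1/300th-second resolution:

```sql
-- Minimal timing sketch: execute the procedure many times and average,
-- so the total duration is far above GETDATE()'s ~1/300th-second resolution.
-- dbo.spProcA is a placeholder; substitute the procedure under test.
DECLARE @Start DATETIME;
DECLARE @i INT;

SET @i = 0;
SET @Start = GETDATE();

WHILE @i < 100000
BEGIN
    EXEC dbo.spProcA;
    SET @i = @i + 1;
END;

-- Average elapsed time per call, in milliseconds
SELECT DATEDIFF(ms, @Start, GETDATE()) / 100000.0 AS AvgElapsedMs;
```

The averaging works for the same statistical reason as the production example above: the quantization error of individual samples washes out over many iterations.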
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
May 23, 2009 at 8:40 am
{approaching sensitive problem very carefully} If I understand correctly, this is a stored proc that processes a single row? If it's used in conjunction with a GUI, that's probably ok. If it's used as part of a batch process, I'll suggest that the calling or parent procedure should probably be rewritten to get the levels of performance up a bit (maybe, a lot).
Of course, that may be why Goldie is trying to measure the performance... trying to prove that a rewrite is necessary.
If so, then as Barry suggests, you have to measure its performance over time, and having the limit of 1/3rd of second resolution with GETDATE() in 2k5 is likely going to be a problem. If you want to know just what the proc is doing for performance, then you'll likely need to set up a SQL Server Profiler trace to catch the duration stats (might as well include CPU time, Reads, Writes, and Rowcounts while you're at it) in a table. Displaying the trace to the screen probably won't do it because it's displayed as milliseconds. Capturing to a trace table will do it at the microsecond level.
If you're careful about the settings of the trace, the information gathered will be neither overwhelming nor will it interfere with performance to any great extent. To wit, I'd set up the trace to capture the SP:Completed event class with a filter on the ObjectID column to match the Object_ID of the stored procedure. That way, only information about that particular stored procedure will be captured and can very easily and quickly be analyzed.
Of course, if you want to include the performance hit of actually calling the stored procedure multiple times, it should be fairly easy to set up a WHILE loop to call it multiple times. If you only need duration, then starting and ending a GETDATE() timer outside the loop will suffice. If you want more information like what the total CPU time, Reads, Writes, and Rowcounts would be, then setting up a different SQL Server Profiler trace for SQL:BatchCompleted with a filter for the particular SPID you're doing the testing from would provide exactly the information for that experiment.
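For the trace-table route, here is a sketch of loading and summarizing the captured events (the file path, table name, and procedure name are hypothetical placeholders). Note that in a trace table the Duration column is reported in microseconds, while CPU is in milliseconds:

```sql
-- Load a saved trace file into a table (path and table name are placeholders).
SELECT *
INTO   dbo.MyTraceTable
FROM   sys.fn_trace_gettable(N'C:\Traces\MyTrace.trc', DEFAULT);

-- Summarize the captured SP:Completed rows for the procedure under test.
-- Duration is in microseconds in a trace table; CPU is in milliseconds.
SELECT COUNT(*)      AS Execs,
       AVG(Duration) AS AvgDurationMicroSec,
       AVG(CPU)      AS AvgCpuMs,
       AVG(Reads)    AS AvgReads,
       AVG(Writes)   AS AvgWrites
FROM   dbo.MyTraceTable
WHERE  ObjectID = OBJECT_ID('dbo.spProcA');  -- placeholder procedure name
```

With the ObjectID filter applied in the trace definition itself, the loaded table stays small and the summary query is trivial.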
--Jeff Moden
Change is inevitable... Change for the better is not.
May 23, 2009 at 10:27 am
Jeff: this is a typo, I think:
Jeff Moden (5/23/2009)
If so, then as Barry suggests, you have to measure its performance over time, and having the limit of 1/3rd of second resolution with GETDATE() in 2k5 is likely going to be a problem.
I know that you know it's 1/300th of a second, but other readers might get confused. 🙂
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
May 23, 2009 at 11:02 am
Jeff Moden (5/23/2009)
If I understand correctly, this is a stored proc that processes a single row? If it's used in conjunction with a GUI, that's probably ok. If it's used as part of a batch process, I'll suggest that the calling or parent procedure should probably be rewritten to get the levels of performance up a bit (maybe, a lot).
Yup. For batch processing you need to query "sys.dm_exec_sessions" instead of "sys.dm_exec_requests". But if the task is to measure batch processing I would also suggest using SQL Profiler. The procedures I posted are just helpers to detect performance issues within one batch. Profiler is much more powerful and easier to use.
May 23, 2009 at 12:24 pm
Florian Reischl (5/23/2009)
...For batch processing you need to query "sys.dm_exec_sessions" instead of "sys.dm_exec_requests".
Could you explain that some more, Florian? In what kind of situation would you need to measure more than one batch in the same session?
But if the task is to measure batch processing I would also suggest using SQL Profiler. The procedures I posted are just helpers to detect performance issues within one batch. Profiler is much more powerful and easier to use.
I'd agree with "powerful", but I'd have to disagree with "easy to use". I've been doing performance management professionally for more than 20 years, and SQL Profiler is definitely one of the least easy to use performance collection tools that I have ever used. This is a problem endemic to Microsoft, though: even the good ones (like PerfMon) aren't very good, and they suffer from a lot of inattention.
Now, it is true that, although event-based performance collectors are usually the most powerful in terms of their ability to collect detailed accurate information, they are also almost always problematic from a usability standpoint (they also tend to have issues with saturation dropouts and storage space).
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
May 23, 2009 at 12:34 pm
RBarryYoung (5/23/2009)
Jeff: this is a typo, I think:
Jeff Moden (5/23/2009)
If so, then as Barry suggests, you have to measure its performance over time, and having the limit of 1/3rd of second resolution with GETDATE() in 2k5 is likely going to be a problem.
I know that you know it's 1/300th of a second, but other readers might get confused. 🙂
Definitely a typo due to lack of caffeine. 😛
--Jeff Moden
Change is inevitable... Change for the better is not.
May 23, 2009 at 12:41 pm
Hi Barry
RBarryYoung (5/23/2009)
Florian Reischl (5/23/2009)
...For batch processing you need to query "sys.dm_exec_sessions" instead of "sys.dm_exec_requests".
Could you explain that some more, Florian? In what kind of situation would you need to measure more than one batch in the same session?
Maybe I explained it wrong.
I mean if you have a large script or procedure which does not perform very well, "sys.dm_exec_requests" is a nice source to get information during the execution.
As Jeff mentioned, if Goldie tries to use sys.dm_exec_requests outside the script/procedure, it will return either zero or just the execution information for that call. sys.dm_exec_sessions will show the complete, cumulative resource usage information for the current session.
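As a sketch, the session-level cumulative numbers can be read like this, run from the session you are measuring (these columns all exist in the SQL Server 2005 DMV):

```sql
-- Cumulative resource usage for the current session, accumulated across
-- all requests since the session connected.
SELECT session_id,
       cpu_time           AS TotalCpuMs,
       total_elapsed_time AS TotalElapsedMs,
       reads              AS TotalPhysicalReads,
       logical_reads      AS TotalLogicalReads,
       writes             AS TotalWrites
FROM   sys.dm_exec_sessions
WHERE  session_id = @@SPID;
```

Sampling this before and after a batch and diffing the two snapshots isolates what that batch alone consumed.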
But if the task is to measure batch processing I would also suggest using SQL Profiler. The procedures I posted are just helpers to detect performance issues within one batch. Profiler is much more powerful and easier to use.
I'd agree with "powerful", but I'd have to disagree with "easy to use". I've been doing performance management professionally for more than 20 years, and SQL Profiler is definitely one of the least easy to use performance collection tools that I have ever used. This is a problem endemic to Microsoft, though: even the good ones (like PerfMon) aren't very good, and they suffer from a lot of inattention.
Now, it is true that, although event-based performance collectors are usually the most powerful in terms of their ability to collect detailed accurate information, they are also almost always problematic from a usability standpoint (they also tend to have issues with saturation dropouts and storage space).
Heh... Maybe you are correct. I think the complexity of those tools comes from the difficulty of measuring performance in the first place. But Profiler is not so complicated for basic performance tasks, in my opinion. Maybe I haven't used it enough to run into the real problems.
Greets
Flo
Edited: Corrected quoting
May 23, 2009 at 12:42 pm
Florian Reischl (5/23/2009)
Jeff Moden (5/23/2009)
If I understand correctly, this is a stored proc that processes a single row? If it's used in conjunction with a GUI, that's probably ok. If it's used as part of a batch process, I'll suggest that the calling or parent procedure should probably be rewritten to get the levels of performance up a bit (maybe, a lot).
Yup. For batch processing you need to query "sys.dm_exec_sessions" instead of "sys.dm_exec_requests". But if the task is to measure batch processing I would also suggest using SQL Profiler. The procedures I posted are just helpers to detect performance issues within one batch. Profiler is much more powerful and easier to use.
Sorry, Flo... didn't even look at your good stored procs. I went straight for the profiler in this case. Was actually more concerned that someone may have written a RBAR stored proc in the quote you cited.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 23, 2009 at 12:45 pm
Jeff Moden (5/23/2009)
Florian Reischl (5/23/2009)
Jeff Moden (5/23/2009)
If I understand correctly, this is a stored proc that processes a single row? If it's used in conjunction with a GUI, that's probably ok. If it's used as part of a batch process, I'll suggest that the calling or parent procedure should probably be rewritten to get the levels of performance up a bit (maybe, a lot).
Yup. For batch processing you need to query "sys.dm_exec_sessions" instead of "sys.dm_exec_requests". But if the task is to measure batch processing I would also suggest using SQL Profiler. The procedures I posted are just helpers to detect performance issues within one batch. Profiler is much more powerful and easier to use.
Sorry, Flo... didn't even look at your good stored procs. I went straight for the profiler in this case. Was actually more concerned that someone may have written a RBAR stored proc in the quote you cited.
Nothing to apologize for, Jeff 🙂
May 26, 2009 at 10:45 am
Jeff Moden (5/23/2009)
{approaching sensitive problem very carefully} If I understand correctly, this is a stored proc that processes a single row? If it's used in conjunction with a GUI, that's probably ok. If it's used as part of a batch process, I'll suggest that the calling or parent procedure should probably be rewritten to get the levels of performance up a bit (maybe, a lot).
Of course, that may be why Goldie is trying to measure the performance... trying to prove that a rewrite is necessary.
If so, then as Barry suggests, you have to measure its performance over time, and having the limit of 1/3rd of second resolution with GETDATE() in 2k5 is likely going to be a problem. If you want to know just what the proc is doing for performance, then you'll likely need to set up a SQL Server Profiler trace to catch the duration stats (might as well include CPU time, Reads, Writes, and Rowcounts while you're at it) in a table. Displaying the trace to the screen probably won't do it because it's displayed as milliseconds. Capturing to a trace table will do it at the microsecond level.
If you're careful about the settings of the trace, the information gathered will be neither overwhelming nor will it interfere with performance to any great extent. To wit, I'd set up the trace to capture the SP:Completed event class with a filter on the ObjectID column to match the Object_ID of the stored procedure. That way, only information about that particular stored procedure will be captured and can very easily and quickly be analyzed.
Of course, if you want to include the performance hit of actually calling the stored procedure multiple times, it should be fairly easy to set up a WHILE loop to call it multiple times. If you only need duration, then starting and ending a GETDATE() timer outside the loop will suffice. If you want more information like what the total CPU time, Reads, Writes, and Rowcounts would be, then setting up a different SQL Server Profiler trace for SQL:BatchCompleted with a filter for the particular SPID you're doing the testing from would provide exactly the information for that experiment.
The stored procedure creates a dynamic query and then executes it.
I think that building the actual dynamic query in the database is more expensive than generating it in the application itself.
I just want to store off the time it takes to build the dynamic query.
I am not looking to store the total execution time of the procedure.
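Something like this minimal sketch would capture only the build step (the query text and the dbo.BuildTimings logging table are hypothetical names). Note that a single build will often report 0 ms because of the 1/300th-second resolution discussed above, so looping the build step many times may still be necessary to get a usable number:

```sql
-- Time only the construction of the dynamic SQL, not its execution.
-- dbo.BuildTimings and the query text are placeholder examples.
DECLARE @Sql NVARCHAR(MAX);
DECLARE @BuildStart DATETIME;
DECLARE @BuildMs INT;

SET @BuildStart = GETDATE();

-- ... build the dynamic query string here ...
SET @Sql = N'SELECT * FROM dbo.SomeTable WHERE SomeColumn = @p';

SET @BuildMs = DATEDIFF(ms, @BuildStart, GETDATE());

-- Store off the build time only.
INSERT INTO dbo.BuildTimings (LoggedAt, BuildMs)
VALUES (GETDATE(), @BuildMs);

-- Execute separately so execution time is excluded from @BuildMs.
EXEC sp_executesql @Sql, N'@p INT', @p = 1;
```

Bracketing the build and the EXEC separately is what keeps the stored number limited to construction cost.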