August 13, 2012 at 10:17 pm
Hello,
In my organization, Excel reports are query driven, but these queries are executed with an EXEC stored procedure statement. I am not interested in SPs that are being called by applications other than Excel.
I would like to know how to go about discovering which stored procedures are being run against a database, how often, and how long they run. The goal is to build a priority list of SPs that need optimization.
Will the forum kindly advise on the techniques for doing this?
--Quote me
August 14, 2012 at 1:43 am
Run a trace for the application in question. You say Excel; the application name depends on the Office version.
We run 2007, so I would filter on the application name "2007 Microsoft Office system"; 2003 and 2010 will have different names.
I would say you only need the "Tuning" template, since all you want to see is how often an SP is called and how long it takes; you can then do your aggregates on the trace output.
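Once the trace has been saved to a file, you can do those aggregates with fn_trace_gettable. Something along these lines (an untested sketch; the file path is just a placeholder, and the application name should match your Office version):
-- Rough sketch: aggregate a saved trace file to see which statements Excel runs,
-- how often, and how long they take. The file path is a placeholder.
SELECT
    CAST(TextData AS NVARCHAR(MAX)) AS StatementText,
    COUNT(*)                        AS Executions,
    AVG(Duration) / 1000.0          AS AvgDurationMs,   -- Duration is stored in microseconds
    SUM(Duration) / 1000.0          AS TotalDurationMs
FROM sys.fn_trace_gettable(N'C:\Traces\ExcelTrace.trc', DEFAULT)
WHERE ApplicationName = N'2007 Microsoft Office system'
  AND EventClass IN (10, 12)        -- RPC:Completed, SQL:BatchCompleted
GROUP BY CAST(TextData AS NVARCHAR(MAX))
ORDER BY TotalDurationMs DESC;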
August 14, 2012 at 4:48 am
Same answer as Anthony, but since you're on 2008, I'd strongly suggest using Extended Events instead of trace. They cause much less impact on the server, you can do the same things as with trace, and they're easier to set up and maintain.
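A minimal sketch of such a session on 2008 might look like this (the session name, file paths and the application name filter are placeholders to adjust for your environment):
-- Sketch only: capture completed RPC calls (EXEC of stored procedures) from Excel.
CREATE EVENT SESSION ExcelProcCalls ON SERVER
ADD EVENT sqlserver.rpc_completed
(
    ACTION (sqlserver.sql_text, sqlserver.database_id, sqlserver.client_app_name)
    WHERE sqlserver.client_app_name = N'2007 Microsoft Office system'
)
ADD TARGET package0.asynchronous_file_target
(
    SET filename = N'C:\Traces\ExcelProcCalls.xel',
        metadatafile = N'C:\Traces\ExcelProcCalls.xem'  -- metadata file is required on 2008
)
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION ExcelProcCalls ON SERVER STATE = START;
You can then read the files back with sys.fn_xe_file_target_read_file and shred the event XML to count calls and sum durations per procedure.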
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
August 14, 2012 at 8:05 am
I may be thinking along the wrong lines, but I think you can use the DMVs to get this info from the cached plans for ad hoc/prepared statements. From those you can get the usage stats and the statement that was run, but only for SQL that has actually been run against the DB. It may need a bit of tweaking, but try:
--Written By E Howes
SELECT
cp.objtype
,qs.[creation_time]
,qs.[last_execution_time]
,qs.[execution_count]
,qs.[total_worker_time]
,qs.[last_worker_time]
,qs.[min_worker_time]
,qs.[max_worker_time]
,qs.[total_physical_reads]
,qs.[last_physical_reads]
,qs.[min_physical_reads]
,qs.[max_physical_reads]
,qs.[total_logical_writes]
,qs.[last_logical_writes]
,qs.[min_logical_writes]
,qs.[max_logical_writes]
,qs.[total_logical_reads]
,qs.[last_logical_reads]
,qs.[min_logical_reads]
,qs.[max_logical_reads]
,qs.[total_clr_time]
,qs.[last_clr_time]
,qs.[min_clr_time]
,qs.[max_clr_time]
,qs.[total_elapsed_time]
,qs.[last_elapsed_time]
,qs.[min_elapsed_time]
,qs.[max_elapsed_time]
,qs.[total_rows]
,qs.[last_rows]
,qs.[min_rows]
,qs.[max_rows]
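-- note: the *_rows columns above were added after the original 2008 release (2008 R2 SP1); remove them if your build does not recognise them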
,t.text
FROM
sys.dm_exec_cached_plans cp
INNER JOIN [sys].[dm_exec_query_stats] qs ON qs.plan_handle = cp.plan_handle
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) t
WHERE
cp.objtype NOT IN ('proc', 'trigger') -- exclude cached SPs and triggers; keep only prepared/ad hoc SQL
ORDER BY
qs.last_worker_time DESC -- order by whichever metric matters to you
SQL DBA
Every day is a school day, and don't trust anyone who tells you any different.
http://sqlblogness.blogspot.co.uk
August 14, 2012 at 8:19 am
The query stats DMV is fine if you just want a rough idea. It only returns data for queries whose plans are still in cache, so any that were never cached or have been removed from cache either won't show or will show inaccurate execution counts.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 14, 2012 at 8:25 am
Extended Events, Profiler traces and querying the DMVs are all probably better options, but I just wanted to mention for all the noobs out there that there is also a built-in report for this in 2008, Object Execution Statistics, that may be helpful.