This post is a T-SQL Tuesday entry, hosted this week by Aaron Nelson on the topic of “Reporting”. (It got a little long. Ordinarily I’d have broken this up into a series and fleshed out individual pieces a bit better, but this touches on most of the general points.)
I like babysitter reports. What is a "babysitter" report? It's a report that you schedule to run on a recurring basis that checks things for you. I call them babysitter reports because they watch things so you don't have to. Every environment has different things that need watching. Maybe a certain value found its way into a certain table and you need to take action because of it. Maybe a certain query is on a rampage again and you need to kill it. There are all kinds of things you know you should keep an eye on but don't always remember to check. Instead of putting that burden on your memory or calendar, these automated reports do the work for you.
Here I will show you how to create one simple babysitter report. I intentionally chose one of the more complicated ones (CPU Threshold) to show how far it can be expanded, but more basic checks wouldn't need this level of customization. Here are a few examples of things you could create babysitter reports for (a rough sketch of the first one follows the list):
- Long Running Queries
- Blocking SP's
- Index fragmentation
- Log Size
- System Info
- Specific entries into tables
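To make that concrete, here's what a subscription query for the first item on the list might look like. This is only a rough sketch; the 10-minute threshold, the session_id > 50 filter, and the email address are placeholders you'd adjust for your own environment.

-- Minimal "Long Running Queries" check (thresholds and address are placeholders)
SELECT r.session_id,
       r.start_time,
       r.status,
       r.command,
       'youremailaddress@yourdomain.com' AS Notify
FROM sys.dm_exec_requests r
WHERE r.session_id > 50                            -- skip system sessions
  AND r.start_time < DATEADD(mi, -10, GETDATE())   -- running for more than 10 minutes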
The sky is the limit. The same strategies can be used to get information to your users when rare events occur that require immediate action, if your system doesn't already provide a way to get that information to them in a timely manner. There are certain reports in my environment that can run for *days* if the wrong parameters are sent to them... and while ideally those would be fixed in other ways, it's good to catch those situations when they occur and take action in the interim.
Here are a few sample queries for finding queries with abnormally high CPU usage. There are two basic parts to these. The first is the data driven subscription query. You want this to be as streamlined as possible. This is the piece that will be run repeatedly to see if a problem exists, and because it could run hundreds of times before its criteria are met once, you want it to be as efficient as possible.
/* =============================================================================================
CREATE DATE:    04/12/2010
LAST MODIFIED:  04/12/2010
CREATED BY:     SETH PHELABAUM
PURPOSE:        Data Driven Subscription that monitors for queries using high CPU.
ISSUES:         Will Notify you Repeatedly.
Notes:          This can be expanded upon quite a bit.  For instance, you could also:
                Set up a Logging table / set of tables to control how often this notifies you
                    (To stop you from getting multiple emails overnight)
                Set up a list of excluded sp's
                Set up a list of different actions depending on the time of day
                    (You could also change the schedule in reporting services)
                Much more...
============================================================================================= */
CREATE PROCEDURE DDS_HighCPU
AS
SELECT DISTINCT spid, 'youremailaddress@yourdomain.com' Notify
FROM sys.sysprocesses
WHERE [dbid] > 4                   -- May need to filter out additional databases here for your setup
  AND cpu > 10000                  -- Adjust to whatever you consider worth knowing about.
  AND cmd <> 'AWAITING COMMAND'    -- Don't want to be notified about these.
  AND spid IN (SELECT spid
               FROM sys.dm_exec_connections DMV_EC
               WHERE DMV_EC.last_read  > DATEADD(mm,-2,GETDATE())
                  OR DMV_EC.last_write > DATEADD(mm,-2,GETDATE()))   -- Another filter to hopefully stop some excess emails
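Before wiring this into a subscription, you can sanity-check it by hand. If it returns no rows, the data driven subscription has nothing to deliver and stays quiet.

-- Run the subscription query manually to see whether anything currently trips it
EXEC DDS_HighCPU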
The second part is the actual report query. This can be a bit more in depth and contain all kinds of information that helps you take action based on the event that transpired.
/* =============================================================================================
CREATE DATE:    04/12/2010
LAST MODIFIED:  04/12/2010
CREATED BY:     SETH PHELABAUM
PURPOSE:        Pulls information about queries that use a large amount of CPU
ISSUES:         Will Notify you Repeatedly.
Notes:          This can be expanded upon quite a bit.  For instance, you could also include:
                Trace events
                Blocked Processes
                System stats (current cpu usage/io etc.)
                Much more...
============================================================================================= */
CREATE PROCEDURE Rpt_HighCPU
(
    @SPID int
)
AS
DECLARE @sql_handle varbinary(64),
        @stmt_start int,
        @stmt_end   int,
        @FNGS       nvarchar(max),
        @DBIB       nvarchar(4000)

SELECT TOP 1 @sql_handle = sql_handle,
             @stmt_start = stmt_start,
             @stmt_end   = stmt_end
FROM sys.sysprocesses (NOLOCK)
WHERE spid = @SPID
ORDER BY sql_handle DESC    -- Or stmt_start DESC

SELECT @FNGS = CASE WHEN @stmt_start > 0
                    THEN SUBSTRING([text], (@stmt_start + 2)/2,
                                   CASE @stmt_end
                                        WHEN -1 THEN (DATALENGTH([text]))
                                        ELSE (@stmt_end - @stmt_start + 2)/2
                                   END)
                    ELSE [text]
               END
FROM ::fn_get_sql(@sql_handle)

CREATE TABLE #B (eventtype nvarchar(30), parameters int, eventinfo nvarchar(4000))

INSERT INTO #B (EventType, Parameters, EventInfo)
EXEC ('dbcc inputbuffer (' + CAST(@SPID AS varchar(10)) + ') with no_infomsgs')   -- CAST needed so the int spid concatenates into the string

SELECT @DBIB = EventInfo FROM #B

SELECT TOP 1 @FNGS FNGS, @DBIB DBIB, cpu, physical_io, memusage, status, nt_username, last_batch
FROM sys.sysprocesses
WHERE spid = @SPID
ORDER BY sql_handle DESC
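As an aside, ::fn_get_sql is an older function; if you'd rather lean on the newer DMVs (available since SQL Server 2005), the same statement text can be pulled from sys.dm_exec_requests and sys.dm_exec_sql_text. This is just a sketch of that approach, not part of the report above, and the @SPID value is a made-up example.

-- Sketch: current statement text for one spid via the DMVs instead of ::fn_get_sql
DECLARE @SPID int
SET @SPID = 55                                   -- hypothetical spid to inspect

SELECT SUBSTRING(st.[text],
                 (r.statement_start_offset / 2) + 1,
                 ((CASE r.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.[text])
                        ELSE r.statement_end_offset
                   END - r.statement_start_offset) / 2) + 1) AS current_statement
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) st
WHERE r.session_id = @SPID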
We also need something that will put a strain on the server to demonstrate the report in action, so I created this ridiculous little SP to run for a while.
CREATE PROCEDURE dbo.SillyLongRun
AS
exec dbo.SillyLongRun2
GO
/* =============================================================================================
CREATE DATE:    04/12/2010
LAST MODIFIED:  04/12/2010
CREATED BY:     SETH PHELABAUM
PURPOSE:        To run for a while.
ISSUES:         Totally Pointless
Notes:          Header here mainly to demonstrate the usage of stmt_start and stmt_end with fn_get_sql.
                This thing sucks up resources, so don't run it on a production box.
=============================================================================================
exec dbo.SillyLongRun2
*/
CREATE PROCEDURE dbo.SillyLongRun2
AS
;WITH    -- Tally table Gen              Tally Rows:    X2              X3
 t1 AS (SELECT 1 N UNION ALL SELECT 1 N),         --    4               ,8
 t2 AS (SELECT 1 N FROM t1 x, t1 y),              --    16              ,64
 t3 AS (SELECT 1 N FROM t2 x, t2 y),              --    256             ,4096
 t4 AS (SELECT 1 N FROM t3 x, t3 y),              --    65536           ,16,777,216
 t5 AS (SELECT 1 N FROM t4 x, t4 y),              --    4,294,967,296   , A lot
 Tally AS (SELECT CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as bigint) N
           FROM t5 x, t5 y)   -- Change the t5's to one of the other numbers above for more/less rows
SELECT N, CAST(N as varchar(30)), DATEADD(ms, CAST(N % 86400000 AS int), 0)   -- modulo keeps the value in int range for DATEADD
FROM Tally
(You can call the above with exec dbo.SillyLongRun)
As mentioned in the headers, you would ideally keep a log of when you were notified about things. Different alerts could be scheduled with different frequencies. Perhaps you only want to be notified about certain things once a week, but other things you want to be notified about once an hour until they are taken care of. This is where a logging table comes in. I won’t go into it in depth here, but wanted to mention it.
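To give a rough idea of what that could look like, here's a sketch. The table, column names, and the one-hour repeat interval are all made up for illustration; you'd wrap something like the IF check around the subscription query above.

-- Hypothetical notification log and guard clause (names/interval are illustrative only)
CREATE TABLE dbo.BabysitterNotificationLog
(
    LogID      int IDENTITY(1,1) PRIMARY KEY,
    AlertName  varchar(100) NOT NULL,             -- e.g. 'HighCPU'
    NotifiedAt datetime     NOT NULL DEFAULT (GETDATE())
)
GO

IF NOT EXISTS (SELECT 1
               FROM dbo.BabysitterNotificationLog
               WHERE AlertName = 'HighCPU'
                 AND NotifiedAt > DATEADD(hh, -1, GETDATE()))   -- already notified within the last hour?
BEGIN
    INSERT INTO dbo.BabysitterNotificationLog (AlertName) VALUES ('HighCPU')
    -- ... the SELECT that feeds the subscription goes here ...
END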
Now that we have the queries, we need to set up the report. I’m going to assume that you already have Reporting Services set up. Here is a screenshot of a very basic report that I created to pull in the data.
Once you deploy this report, there are a couple more things you need to do before you can create a data driven subscription for it. The first is to set up a shared schedule. Log into your report server (http://localhost/reports) and go to Site Settings at the top. Then click on Schedules at the left and New Schedule. For this one I’m just going to create a basic 15-minute recurring schedule.
Next we need to modify the credentials of the shared data source used for the report. My data source name for the report is SS2K8CL100. To modify it, I go back to Home –> Data Sources –> SS2K8CL100. The screenshot below shows me modifying it to use a Windows account.
Now, we’re ready to create the data driven subscription. Rather than explain it in text, I’ve taken screenshots of each step of creating a data driven subscription.
Click Finish and you have your report.
In closing, I’ll note that I had a lot of problems getting Reporting Services to function correctly on my Windows 7 installation, so this isn’t as polished as I would have liked. I didn’t get the email delivery working, and I forgot to include the SPID anywhere on the report (a pretty useful piece of information to have).