February 10, 2022 at 11:39 am
I'm testing a long running job and want to stop it after it has been running for, say, 2 hours. How can I do this automatically, i.e. without being around to manually stop it?
February 10, 2022 at 11:40 am
I should have added: the job is simply executing a stored procedure.
February 10, 2022 at 2:37 pm
There's no way to do this within the query itself, so what you have to do instead is set up a monitoring job. Start it at the same time as the existing job, have it check the status, and then stop the process (I assume you mean a kill; nothing else I can think of) when it exceeds a time limit. Or, have a second job start 2 hours later, check for the existence of the first, and kill it if it's still there. In fact, I like that second idea better.
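That second idea could be sketched as a single job step like the following (untested; the job name 'MyLongJob' is a placeholder for your actual job). The second job would be scheduled to start 2 hours after the long-running one:

```sql
-- Sketch: scheduled 2 hours after the long job starts.
-- If the long job is still running, stop it. 'MyLongJob' is an assumption.
IF EXISTS ( SELECT 1
            FROM msdb.dbo.sysjobs j
            INNER JOIN msdb.dbo.sysjobactivity a ON a.job_id = j.job_id
            WHERE j.name = N'MyLongJob'
              AND a.run_requested_date IS NOT NULL   -- a run was started...
              AND a.stop_execution_date IS NULL )    -- ...and hasn't finished
    EXEC msdb.dbo.sp_stop_job @job_name = N'MyLongJob';
```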
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 10, 2022 at 2:52 pm
Thanks Grant. I presume there are sys tables for jobs and job activity. Not sure if I then do sp_stop_job or KILL process.
February 10, 2022 at 2:53 pm
Create another job that runs every N minutes to monitor it, and execute msdb.dbo.sp_stop_job if it's been running longer than the desired time.
The msdb.dbo.agent_datetime(run_date, run_time) function can be used to convert the integer run_date and run_time columns to a datetime.
You could probably modify the following function (which determines whether a job is running) to take the job's start time into account and determine whether it has been running too long, or use a job_id parameter instead of job_name if you prefer.
CREATE FUNCTION [dba].[IsJobRunning]
(
    @JobName SYSNAME
)
RETURNS BIT
AS
BEGIN
    DECLARE @job_id UNIQUEIDENTIFIER = (SELECT job_id FROM msdb.dbo.sysjobs WHERE name = @JobName);
    DECLARE @isRunning BIT = 0;

    IF EXISTS ( SELECT *
                FROM msdb.dbo.sysjobs_view
                INNER JOIN msdb.dbo.sysjobactivity
                    ON sysjobs_view.job_id = sysjobactivity.job_id
                INNER JOIN msdb.dbo.syssessions
                    ON syssessions.session_id = sysjobactivity.session_id
                INNER JOIN (SELECT MAX(agent_start_date) AS max_agent_start_date
                            FROM msdb.dbo.syssessions) sess_max
                    ON syssessions.agent_start_date = sess_max.max_agent_start_date
                WHERE sysjobs_view.job_id = @job_id
                  AND sysjobactivity.run_requested_date IS NOT NULL
                  AND sysjobactivity.stop_execution_date IS NULL )
    BEGIN
        SET @isRunning = 1;
    END

    RETURN @isRunning;
END
It's important to join to syssessions: https://sqlstudies.com/2013/09/05/a-t-sql-query-to-get-current-job-activity/
February 10, 2022 at 2:54 pm
Thanks Grant. I presume there are sys tables for jobs and job activity. Not sure if I then do sp_stop_job or KILL process.
I suspect either will work. Testing is your best buddy.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 10, 2022 at 2:57 pm
I have this situation in a few potentially long-running cleanup jobs. I did not create a second job; I modified the proc to run in a loop and test the start time against the current time. If it hits the time limit, the proc exits.
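That pattern could be sketched like this (untested; the procedure name, table, and batch logic are hypothetical placeholders):

```sql
-- Sketch of the loop-with-time-check pattern. All object names are hypothetical.
CREATE OR ALTER PROCEDURE dbo.CleanupWithTimeLimit
    @MaxMinutes INT = 120
AS
BEGIN
    DECLARE @StartTime DATETIME = GETDATE();

    WHILE EXISTS (SELECT 1 FROM dbo.StagingTable WHERE IsProcessed = 0)  -- hypothetical work check
    BEGIN
        -- Check elapsed time before starting each batch.
        IF DATEDIFF(MINUTE, @StartTime, GETDATE()) >= @MaxMinutes
            RETURN;  -- time limit hit: exit cleanly, no kill needed

        -- ... process one batch here, in its own transaction ...
    END
END
```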
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
February 10, 2022 at 2:58 pm
Great, thanks both. Should be able to do it now. Want to avoid looking in on Monday morning and finding the job is still running!
February 10, 2022 at 3:07 pm
That would actually be easy for me to implement, as it so happens the proc is a loop - for batch processing, with each batch in a transaction. As long as the batch isn't itself too large and taking ages.
February 11, 2022 at 3:18 am
Thanks Grant. I presume there are sys tables for jobs and job activity. Not sure if I then do sp_stop_job or KILL process.
If you stop the job and the proc's work is a single 2-hour transaction, then when the job is killed, you'll get a rollback.
Like Michael John suggests, break up the proc (one of the few good reasons to use a WHILE loop) and have it test the time at the beginning of each section. If there's not enough time for the next section to complete, simply exit the job. You should probably have a log in place, or some output from the proc, that tells you how far it got. The proc itself should have a parameter that identifies the quitting date/time so the job can pass that time in without requiring a change to the proc if the time changes. It would also allow you to let it run with no quitting time by passing '9999' (01 Jan 9999) as the quitting date/time.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 14, 2022 at 12:56 pm
I posed this question to the resident Microsoft DSE years ago where I work. The solution I was given was to run the SP in a PowerShell step in the job, with a timeout setting that stops the SP execution after the target number of seconds. Here's an example to demonstrate this. Your SP does not have a WAITFOR in it, but the demo SP does, to ensure that the SP takes at least the WAITFOR time so we can make it fail the time limit in the job.
USE [myDb] /* change this to your database */
GO
CREATE PROCEDURE [dbo].[sp_TestTimeLimit]
AS
BEGIN
WAITFOR DELAY '00:00:10'; /* 10 second delay */
PRINT 'hello, world (after delay)';
END
GO
Create a job with a PowerShell step to run this SP, but set the execution timeout limit to a lower value (5 seconds) than the wait (10 seconds) in the SP.
$Server = 'myServer' #change this to your server\instance
$database = 'myDb' #change to the database name which contains the SP to execute
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server=$Server;database=$Database;trusted_connection=true;ApplicationIntent=ReadOnly;Connect Timeout=120"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
$Command.CommandText = 'EXECUTE myDb.dbo.sp_TestTimeLimit'
$Command.CommandTimeout = 5 #5 seconds limit
$Reader = $Command.ExecuteReader()
$Datatable = New-Object System.Data.DataTable
$Datatable.Load($Reader)
$Datatable.Rows
$Connection.Close()
Then run the job that has the above PowerShell SP execution step. The above example fails the time limit with this error:
A job step received an error at line 11 in a PowerShell script. The corresponding line is '$Reader = $Command.ExecuteReader() '. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Exception calling "ExecuteReader" with "0" argument(s): "Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding." '. Process Exit Code -1. The step failed.
Now, change the PowerShell job step line that sets the CommandTimeout to a reasonable value (say, 30 seconds), rerun the job, and it should succeed.
October 3, 2024 at 9:34 pm
There's usually a maintenance job of some sort that runs nightly. For example, I have the History Cleanup job running daily at 4am.
I would just add a job step to the History Cleanup job to stop any target jobs from running beyond 4AM.
--\\ Query to see if the target job is still running. The below uses the job name but you can switch it out to use the job_id.
IF EXISTS(SELECT 1
FROM msdb.dbo.sysjobs J
JOIN msdb.dbo.sysjobactivity A ON A.job_id=J.job_id
WHERE J.name like 'Index Optimization%'
AND A.run_requested_date IS NOT NULL
AND A.stop_execution_date IS NULL
)
BEGIN
PRINT 'The job is running!'
--EXEC sp_stop_job @job_name = 'Index Optimization.FRI-8AM'
--EXEC sp_stop_job @job_id = '08C42890-97A2-4041-9A76-###########'
--// Send Out Notification
EXEC msdb.dbo.sp_send_dbmail
@recipients = 'sqlalert@freeinternet.com;'
, @subject = '[ServerName]Auto Stopped [Jobname] Running Longer than 1HR'
, @body = '[Jobname] was stopped at 9PM on [ServerName] since it reached its 1HR run limit.'
END
ELSE
PRINT 'The job is not running.'
Tung Dang
Azure and SQL Server DBA Contractor / Consultant
DataZip