April 1, 2016 at 5:49 am
At my work site, the nightly process that refreshes a SQL database from the main (SAP) tables occasionally doesn't run until well into the business day. When that happens, deadlocks often occur, causing an already late process to come to a screeching halt. Is there a way to give the batch process priority over online users? Or at least lengthen the time that a process waits until it self-immolates (or whatever processes do when they die)?
April 1, 2016 at 6:04 am
You can use a SET statement to set deadlock priority to high, or you can tune code so that it runs faster and doesn't deadlock.
It's not about how long it waits. A deadlock, by definition, is a locking/blocking loop that, if left alone, would never resolve. Hence, when SQL Server identifies a deadlock, it kills one of the sessions immediately. There's no point in waiting; none of the sessions involved would get any further if it did wait.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
April 1, 2016 at 6:08 am
See SET DEADLOCK_PRIORITY (https://msdn.microsoft.com/en-us/library/ms186736.aspx) and use that for your batch processes. Set them to HIGH and you should be good. This could be easier than setting LOW on all of your other processes, but either way should work.
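As a minimal sketch (the table names are placeholders, not from this thread), the batch session just issues the SET at the top of its work:

```sql
-- Sessions with higher deadlock priority are less likely to be chosen
-- as the deadlock victim.
SET DEADLOCK_PRIORITY HIGH;

BEGIN TRANSACTION;
    -- batch refresh work goes here, e.g.:
    -- UPDATE dbo.TargetTable SET ... FROM dbo.StagingTable ...;
COMMIT TRANSACTION;

-- The priority can also be set numerically, from -10 to 10:
-- SET DEADLOCK_PRIORITY 10;
```

The setting is per session, so it only needs to be added to the batch job itself, not to every online connection.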
What you really need to do though is tune the data movement process to get it to run in a shorter time.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
April 1, 2016 at 8:39 am
I agree with the other answers, so this is just piling on.
Fundamentally, deadlocks are a performance issue. Yes, if the code is written in such a way that locks are acquired in different orders, it exacerbates or creates the problem, but, at the core, if the processes involved ran fast enough, deadlocks wouldn't occur (or at the very least, would occur a lot less). Tune the data load process.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 1, 2016 at 8:51 am
Seggerman-675349 (4/1/2016)
At my work site occasionally the nightly process that refreshes a SQL database from the main (SAP) tables doesn't run until well into the business day....
First, if "refreshing" means a complete truncate and re-load, then consider implementing a different load process that only inserts or updates rows as needed. Also, perhaps consider inserting using a smaller batch size, allowing readers a chance to get some work in edgewise and complete what they're doing.
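A smaller-batch load might look like this sketch (table and column names are assumptions, not from the original post); each iteration commits and releases its locks, so readers get a chance in between:

```sql
DECLARE @BatchSize int = 5000,
        @RowsMoved int = 1;

WHILE @RowsMoved > 0
BEGIN
    -- Copy only rows not already present, a batch at a time.
    INSERT INTO dbo.TargetTable (KeyCol, Col1)
    SELECT TOP (@BatchSize) s.KeyCol, s.Col1
    FROM dbo.StagingTable AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.TargetTable AS t
                      WHERE t.KeyCol = s.KeyCol);

    SET @RowsMoved = @@ROWCOUNT;  -- stop once nothing is left to copy
END;
```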
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
April 1, 2016 at 9:02 am
The following snippet is what I used to handle a deadlock issue we were experiencing with a process that updated a TableOfContents table. We found that this process was always chosen as the victim when a deadlock occurred, so we needed to ensure that the batch being processed still completed without restarting the whole process each time.
declare
    @ErrorNumber int,
    @ErrorLine int,
    @ErrorMessage nvarchar(4000),
    @ErrorProcedure nvarchar(128),
    @ErrorSeverity int,
    @ErrorState int;
while 1 = 1
begin
    begin transaction;
    begin try
        -- <-- code goes here -->
        commit;
        break; -- success: leave the retry loop
    end try
    begin catch
        rollback;
        select
            @ErrorNumber = ERROR_NUMBER(),
            @ErrorLine = ERROR_LINE(),
            @ErrorMessage = ERROR_MESSAGE(),
            @ErrorProcedure = ERROR_PROCEDURE(),
            @ErrorSeverity = ERROR_SEVERITY(),
            @ErrorState = ERROR_STATE();
        if @ErrorNumber = 1205 -- Trap deadlock
        begin
            WAITFOR DELAY '00:00:00.05'; -- Wait 50 ms, then retry
            continue;
        end
        else begin
            -- All other errors: re-raise and stop retrying
            raiserror(@ErrorMessage, @ErrorSeverity, @ErrorState);
            break;
        end
    end catch
end
Also, understand that due to the nature of our database, the code being run is actually executed using dynamic SQL, but this should help you develop a solution.
April 1, 2016 at 10:24 am
You could also consider using an A/B table approach. That is, the nightly load goes into a different table name. When the load process is complete, either the table gets renamed to be the main table or the view referencing the table gets changed (each method has its own pluses and minuses, although rename is probably more typical).
For example:
The main table is named "master_table".
The nightly process loads a "master_table_refresh" table.
Once the master_table_refresh is fully (re)built, you rename "master_table" to "master_table_old" and rename "master_table_refresh" to "master_table".
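The swap step could be sketched like this, using the object names from the example above (assuming the tables live in the dbo schema); sp_rename only needs a brief schema-modification lock, so the swap itself is fast:

```sql
-- Keep the swap in its own short transaction so readers never see
-- a moment with no "master_table" at all.
BEGIN TRANSACTION;
    EXEC sp_rename 'dbo.master_table', 'master_table_old';
    EXEC sp_rename 'dbo.master_table_refresh', 'master_table';
COMMIT TRANSACTION;
```

Note that any connections holding locks on master_table will block the rename until they finish, so the swap should be scheduled or retried accordingly.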
Edit, add (I hit Submit too soon by mistake):
Snapshot isolation could be another option. Be aware, though, that it has relatively high overhead. Make sure you allow for an extra 14 bytes in every row of the table, and make sure tempdb can handle any increased load.
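Enabling it is a database-level setting plus a per-session opt-in (the database name below is a placeholder):

```sql
-- Row versions are kept in tempdb, hence the extra load there,
-- and each versioned row carries a 14-byte version pointer.
ALTER DATABASE YourDb SET ALLOW_SNAPSHOT_ISOLATION ON;

-- A reader session then opts in explicitly:
-- SET TRANSACTION ISOLATION LEVEL SNAPSHOT;

-- Alternatively, make versioned reads the default for READ COMMITTED:
-- ALTER DATABASE YourDb SET READ_COMMITTED_SNAPSHOT ON;
```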
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
April 1, 2016 at 11:13 am
On behalf of my client - thanks, all.
The process already makes most of its updates to a staging table - and only at the very end copies the data back onto the SQL tables.
- Making an update-only process is not feasible at this point.
I used Jeff Moden's suggestion to speed up some of the steps - but not all - so I may need to further tweak the logic.
(By the way, THANKS again 🙂 )
I'll mull over setting the priority - it may impact some customers in different time zones.
April 1, 2016 at 8:17 pm
Seggerman-675349 (4/1/2016)
on behalf of my client - thanks all
the process already makes most of its updates to a staging table - and only at the very end copies the data back onto the SQL tables
- making an update only process is not feasible at this point
I used Jeff Moden's suggestion to speed up some of the steps - but not all - I may need to further tweak the logic
(by the way, THANKS, again 🙂
I'll turn over the setting of priority - it may impact some customers in different time zones
Thank you but I've made no entry on this thread. Where did I make such a suggestion? I believe you meant "Gila Monster" a.k.a. Gail Shaw?
--Jeff Moden
Change is inevitable... Change for the better is not.
April 2, 2016 at 8:31 am
A while back you posted a method of adding a running total to a temp table / table variable to get around having to read each record one at a time.
I incorporated it into the longest steps in the process - but not all the steps.
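For readers finding this later: on SQL Server 2012 and up, a set-based running total can be written with a windowed aggregate (the table and columns below are illustrative, not from the thread; the earlier approach referred to predates this and worked against an ordered temp table):

```sql
SELECT  t.AccountID,
        t.TranDate,
        t.Amount,
        -- Running total per account, in date order, without a row-by-row loop.
        SUM(t.Amount) OVER (PARTITION BY t.AccountID
                            ORDER BY t.TranDate
                            ROWS UNBOUNDED PRECEDING) AS RunningTotal
FROM dbo.Transactions AS t
ORDER BY t.AccountID, t.TranDate;
```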
The nightly process now takes about 25 minutes when it runs at 1 to 5 am as it should.
Before the improvement it easily took an hour and a half.
If the operations staff fail to notice an abnormal termination - again! - it's a relief that the job is at least a lot more streamlined if and when it has to run in the middle of the morning.
the gift that keeps on giving