September 13, 2013 at 6:12 am
I have a job which runs a stored procedure. The proc is something like:
BEGIN
STEP 1) Archive records from TABLE B into Table A (around 15K records)
STEP 2) Delete some records from TABLE B
STEP 3) Insert new records into Table B
STEP 4) Update table B
END
This proc takes about 20 mins to run.
The job kicks off at 7:15, and as soon as it starts I cannot access the records in TABLE B that get deleted in STEP 2. I know step 1 takes about 10 minutes. My assumption was that a stored procedure processes statements sequentially, not in parallel. Is it possible that it runs statement 2 along with step 1? I am confused about how it gets executed. I can't see the execution plan, and getting permission would take about 10 days.
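For illustration only, here is a rough sketch of what a proc like this might look like, assuming the four steps run inside a single explicit transaction; every table, column, and predicate below is made up and will differ from the real proc:
CREATE PROCEDURE dbo.ArchiveAndRefreshTableB
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    -- STEP 1: archive roughly 15K rows from TABLE B into Table A
    INSERT INTO dbo.TableA (Col1, Col2, ArchivedOn)
    SELECT Col1, Col2, GETDATE()
    FROM dbo.TableB
    WHERE Status = 'Closed';

    -- STEP 2: delete the archived rows from TABLE B
    DELETE FROM dbo.TableB
    WHERE Status = 'Closed';

    -- STEP 3: insert new rows into TABLE B
    INSERT INTO dbo.TableB (Col1, Col2, Status)
    SELECT Col1, Col2, 'Open'
    FROM dbo.TableB_Staging;

    -- STEP 4: update TABLE B
    UPDATE dbo.TableB
    SET LastRefreshed = GETDATE()
    WHERE Status = 'Open';

    COMMIT TRANSACTION;
END;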
September 13, 2013 at 8:59 am
Since that's all going to be part of a single transaction, you're very likely going to see all sorts of blocking, not just at the row level, but possibly at the page level, preventing you from accessing data in Table B while the transaction completes. If you need to access the data while the proc is running, you might want to look at enabling snapshot isolation on the database.
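If you go that route, here is a minimal sketch of enabling it; the database name is a placeholder, and both options need testing first since row versioning puts extra load on tempdb:
-- Option 1: allow sessions to explicitly request SNAPSHOT isolation
ALTER DATABASE YourDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;
-- Readers then opt in with: SET TRANSACTION ISOLATION LEVEL SNAPSHOT;

-- Option 2: make the default READ COMMITTED level use row versioning
ALTER DATABASE YourDatabase SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;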
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
September 13, 2013 at 9:13 am
In addition to Grant's suggestion, maybe you should look at Step 1. You say there are only about 15K rows, but the archive portion takes 10 minutes. That is awfully slow for that few rows. Are the other three steps fast? I would think that archiving 15K rows should take 10-15 seconds at most.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
September 13, 2013 at 9:27 am
ekant_alone (9/13/2013)
I have a job which runs a stored procedure. The proc is something like:
BEGIN
STEP 1) Archive records from TABLE B into Table A (around 15K records)
STEP 2) Delete some records from TABLE B
STEP 3) Insert new records into Table B
STEP 4) Update table B
END
This proc takes about 20 mins to run.
The job kicks off at 7:15, and as soon as it starts I cannot access the records in TABLE B that get deleted in STEP 2. I know step 1 takes about 10 minutes. My assumption was that a stored procedure processes statements sequentially, not in parallel. Is it possible that it runs statement 2 along with step 1? I am confused about how it gets executed. I can't see the execution plan, and getting permission would take about 10 days.
1) Try sp_whoisactive as soon as you kick off the sproc to look for locking/blocking (see the sketch after this list). It's a GREAT free resource, found on sqlblog.com. You may not have permission to run it either, though.
2) The statements in a proc do not EVER execute at the same time. They are serial, as you expect.
3) Can you see the ESTIMATED execution plan? That could be good enough to identify whether you are doing a nasty table scan or something else silly to delete just a few rows, or to spot problems in other parts of the sproc.
4) The WITH (ROWLOCK) hint could be helpful. See Books Online for info.
5) I question the order of steps 3 and 4. Assuming your INSERT already has all the correct information for every row, doing the INSERT before the UPDATE simply puts more rows in the table that then have to be UPDATED. That's not an efficient coding practice (although it's rather common at clients I visit!).
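To put points 1, 3, and 4 into concrete (purely illustrative) terms, here is a rough sketch; the proc name, table name, and predicate are made up, and ROWLOCK in particular needs testing (see the caution in the next reply):
-- 1) From another session while the job runs (requires the free proc to be installed):
EXEC sp_WhoIsActive;

-- 3) Estimated plans can be captured without executing anything:
SET SHOWPLAN_XML ON;
GO
EXEC dbo.YourArchiveProc;   -- compiled only, not run; returns estimated plans
GO
SET SHOWPLAN_XML OFF;
GO

-- 4) A row-lock hint on, say, the DELETE step:
DELETE FROM dbo.TableB WITH (ROWLOCK)
WHERE Status = 'Closed';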
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
September 13, 2013 at 10:26 am
TheSQLGuru (9/13/2013)
4) The WITH (ROWLOCK) hint could be helpful. See Books Online for info.
Or it could make it worse. 15,000 rows locked at the row level is above the threshold (roughly 5,000 locks) for triggering an escalation to a table-level lock.
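If escalation does turn out to be the problem, it can be disabled per table on SQL Server 2008 and later, though that trades blocking for lock memory and should be tested first; the table name below is only illustrative:
ALTER TABLE dbo.TableB SET (LOCK_ESCALATION = DISABLE);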
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 13, 2013 at 10:28 am
ekant_alone (9/13/2013)
My assumption was that a stored procedure processes statements sequentially, not in parallel. Is it possible that it runs statement 2 along with step 1?
Your assumption is correct: statements in a stored procedure execute in sequence. Step 2 won't start until step 1 completes. Step 1 will lock rows in TableB, though. They should just be shared locks, but they will be locked.
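If you want to see those locks for yourself while the job runs, a quick look at sys.dm_tran_locks from another session will show them (the database name is a placeholder):
SELECT request_session_id,
       resource_type,    -- OBJECT / PAGE / KEY / RID
       request_mode,     -- S, X, IX, etc.
       request_status    -- GRANT or WAIT
FROM sys.dm_tran_locks
WHERE resource_database_id = DB_ID('YourDatabase');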
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 14, 2013 at 7:17 am
GilaMonster (9/13/2013)
TheSQLGuru (9/13/2013)
4) The WITH (ROWLOCK) hint could be helpful. See Books Online for info.
Or it could make it worse. 15,000 rows locked at the row level is above the threshold (roughly 5,000 locks) for triggering an escalation to a table-level lock.
True. But if it is a busy table, there will almost certainly be other row/page locks that prevent the table-lock escalation, allowing row locking to continue (up to memory limitations, IIRC) in 1,250-lock increments, at which point it will try for the table lock again. Other than the memory concerns, in many cases this will probably be a better concurrency scenario than simply attempting to get a table lock at the start (and possibly being blocked for long periods), or actually GETTING the table lock at the start and blocking other activity for the duration of the transaction.
YMMV, and testing is important as it always is when you use hints!
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
November 1, 2013 at 1:42 pm
OP,
I am just curious... how do you know step 1 takes 10 minutes? Is there some way you actually know that, or is it an assumption based on something else?
If you can read from table B, but just not the records expected to be deleted, that would indicate that your time estimates are off and that the step in question has already completed. As has been stated, the second statement will not execute until the first has completed. If you can't read from table B in general because it is locked, that is a separate issue.
Personally, I like using SSIS for ETL tasks like the one you're describing, rather than a single proc that performs many separate tasks. Each separate action gets a distinct step, and if I have any issues I can easily discover exactly how long each step takes and where the errors are. Some of this advantage could be had merely by separating the statements into separate procs and then calling the procs in order via the job. Of course, there are times when the code needs to be kept together, and your case could very well be one of those times.
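If the proc has to stay as one unit, even a tiny log table written to between the steps would answer the timing question; a rough sketch, with a made-up table name:
CREATE TABLE dbo.JobStepLog (
    StepName varchar(50)  NOT NULL,
    LoggedAt datetime2(3) NOT NULL DEFAULT SYSDATETIME()
);

-- inside the proc, after each step:
INSERT INTO dbo.JobStepLog (StepName) VALUES ('Step 1 - archive complete');
-- ... STEP 2 ...
INSERT INTO dbo.JobStepLog (StepName) VALUES ('Step 2 - delete complete');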