January 20, 2018 at 7:27 am
Hi,
I am getting a duplicate primary key failure when two stored procedures call the same SP at the same time. The SP that is called by both has transaction wrapping as shown below:
DECLARE @startingTranCount int;
SET @startingTranCount = @@TRANCOUNT;

IF @startingTranCount > 0
    SAVE TRANSACTION mySavePointName;
ELSE
    BEGIN TRANSACTION;
-- …
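The rest of the wrapper follows the usual savepoint pattern, roughly like this (a sketch only; the actual procedure body and error handling are placeholders, not my real code):

DECLARE @startingTranCount int = @@TRANCOUNT;

BEGIN TRY
    IF @startingTranCount > 0
        SAVE TRANSACTION mySavePointName;
    ELSE
        BEGIN TRANSACTION;

    -- ... the actual work of the procedure happens here ...

    IF @startingTranCount = 0
        COMMIT TRANSACTION;  -- only commit if this proc opened the transaction
END TRY
BEGIN CATCH
    IF @startingTranCount = 0
        ROLLBACK TRANSACTION;                   -- this proc owns the transaction
    ELSE IF XACT_STATE() = 1
        ROLLBACK TRANSACTION mySavePointName;   -- undo only this proc's work
    THROW;
END CATCH;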
Since the transaction name is the same in both calls, is it possible that it is causing the duplicates? I can make the savepoint name unique by using the following snippet. Before I do that, I wanted to check whether the shared transaction name is causing the duplicates or whether it is something else.
DECLARE @mark CHAR(32) = REPLACE(NEWID(), '-', '');
DECLARE @trans INT = @@TRANCOUNT;

IF @trans = 0
    BEGIN TRANSACTION @mark;
ELSE
    SAVE TRANSACTION @mark;
Thanks,
Sridhar.
January 22, 2018 at 1:14 am
No, the transaction name is not causing the duplicates.
If SQL says you have duplicate rows, then you have duplicate rows. That is probably due to using the read-then-insert pattern without sufficient isolation enforced (the default isolation level is not enough).
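To illustrate the race (the table and column names here are made up for the example): under the default READ COMMITTED isolation, two sessions can both read the same MAX value and then both try to insert it.

-- Hypothetical table for illustration only.
-- Session 1 and Session 2 both run this at (nearly) the same time:
DECLARE @next INT;

SELECT @next = MAX(ID) + 1
FROM TestTable;          -- shared lock is released as soon as the read finishes

-- ... both sessions now hold the same @next value ...

INSERT INTO TestTable (ID, SomeValue)
VALUES (@next, 'x');     -- the second insert fails with a duplicate key error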
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 22, 2018 at 6:57 am
Thank you for the response. You are correct that it reads the last ID, increments it by 1, and then inserts into the table with the new ID. Would you be able to tell me what the isolation level should be so that duplicates won't be inserted? Also, I am thinking this will impact performance. What are the possible alternatives?
January 22, 2018 at 7:06 am
Sridhar-137443 - Monday, January 22, 2018 6:57 AM
What are the possible alternatives?
I'm not sure - I can't see any INSERT statements. Please will you post your code?
John
January 22, 2018 at 2:31 pm
Sridhar-137443 - Monday, January 22, 2018 6:57 AM
Thank you for the response. You are correct that it reads the last ID, increments it by 1, and then inserts into the table with the new ID. Would you be able to tell me what the isolation level should be so that duplicates won't be inserted? Also, I am thinking this will impact performance. What are the possible alternatives?
Yes, it'll impact performance. It's your choice: performance or correct results.
I have a part-written blog post on this.

BEGIN TRANSACTION;

DECLARE @next INT;

-- TABLOCKX + HOLDLOCK takes an exclusive table lock and holds it to the end
-- of the transaction, so only one session at a time can read and insert.
SET @next = (SELECT MAX(ManualID) + 1 FROM TestSequence WITH (TABLOCKX, HOLDLOCK));

INSERT INTO TestSequence
VALUES (@next, @@SPID);

COMMIT TRANSACTION;
That seems to work, and yes, the TABLOCKX is necessary.
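As a usage sketch (the procedure name and second column are hypothetical, only TestSequence and ManualID come from the snippet above), the same pattern can be wrapped in a procedure that multiple sessions call concurrently; the exclusive table lock forces each later caller to wait until the earlier one commits:

-- Hypothetical wrapper; assumes TestSequence has columns (ManualID int, SpidUsed int).
CREATE PROCEDURE GetNextManualID
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRANSACTION;

    DECLARE @next INT;

    -- Exclusive table lock held to the end of the transaction:
    -- concurrent callers queue up here instead of reading the same MAX.
    SET @next = (SELECT MAX(ManualID) + 1 FROM TestSequence WITH (TABLOCKX, HOLDLOCK));

    INSERT INTO TestSequence (ManualID, SpidUsed)
    VALUES (@next, @@SPID);

    COMMIT TRANSACTION;

    SELECT @next AS NextID;
END;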
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 22, 2018 at 2:52 pm
Hi Gail, thank you for the code snippet. I thought that when you open a transaction (BEGIN TRANSACTION), all the tables used inside that block are locked until the transaction is committed. Is that not right?
Thanks,
Sridhar.
January 22, 2018 at 3:33 pm
Sridhar-137443 - Monday, January 22, 2018 2:52 PM
Is that not right?
No, it's not.
Locks are taken as needed, and how long they're held depends on the isolation level and the type of lock taken.
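For example (a sketch with a made-up table name): under the default READ COMMITTED, the shared lock from a SELECT is released as soon as the statement finishes, even though the transaction is still open; adding HOLDLOCK (or using SERIALIZABLE) keeps it until the transaction ends.

BEGIN TRANSACTION;

-- Default READ COMMITTED: shared locks are released when the SELECT completes,
-- so other sessions are free to insert before this transaction commits.
SELECT MAX(ID) FROM SomeTable;

-- HOLDLOCK (SERIALIZABLE semantics for this statement): the shared/key-range
-- locks are held until COMMIT, blocking conflicting changes in the meantime.
SELECT MAX(ID) FROM SomeTable WITH (HOLDLOCK);

COMMIT TRANSACTION;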
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability