September 2, 2013 at 4:08 am
Hi
I am trying to write a bulk import procedure, using a table-valued parameter to insert thousands of rows into a table.
The problem is that if any error happens, I am not able to tell which row in the table-valued parameter caused it.
I am not using a transaction in the stored procedure.
Here is the code:
--Table creation
CREATE TABLE DataImport (id int identity(1,1),Name varchar(2))
Insert INTO DataImport
values('aa'),
('bb'),
('cc'),
('dd')
--SELECT * FROM DataImport
--Table type creation
CREATE TYPE udt_Table AS TABLE (ud_name varchar(3))
--stored procedure
CREATE PROC Usp_DataImport
@TVP [dbo].[udt_Table] READONLY
AS
BEGIN
DECLARE @Insert TABLE (ActionType nvarchar(10), NewestName varchar(3), OldestName varchar(3))
MERGE DataImport AS [TARGET]
USING @TVP AS [SOURCE]
ON [TARGET].Name = [SOURCE].ud_name
WHEN MATCHED THEN
UPDATE
SET Name = [SOURCE].ud_name
WHEN NOT MATCHED THEN
INSERT (Name)
VALUES([SOURCE].ud_name)
OUTPUT $ACTION, Inserted.Name,Deleted.Name INTO @Insert;
SELECT * FROM @Insert
END
--Executing Procedure
--First proc execution completes successfully
DECLARE @dd AS dbo.udt_Table
INSERT INTO @dd
values('ee'),
('aa'),
('XX'),
('GG'),
('66')
EXEC Usp_DataImport @TVP = @dd
--Second proc execution fails; this generates an error due to the varchar length. (String validation could be done in the front end to avoid this problem; I am using it here just to generate the error.)
--Here I want to find out at which row the error happens, so that I can return it to the application to make changes and then insert.
DECLARE @dd AS dbo.udt_Table
INSERT INTO @dd
values('ee'),
('aa'),
('XX'),
('GG'),
('GGG'),
('66')
EXEC Usp_DataImport @TVP = @dd
Thanks
September 2, 2013 at 3:58 pm
Unfortunately, this is one of the drawbacks of working set-based. If you insert rows one at a time, finding the bad data is trivial, but that is of course far less efficient. A possible strategy is to trap the error with TRY-CATCH and fall back to a loop in case of an error.
There is a very good Connect request for this problem at https://connect.microsoft.com/SQLServer/feedback/details/774754/new-virtual-table-errors-it-would-analogous-to-the-deleted-and-inserted-tables#details that you could vote for.
[font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]
September 2, 2013 at 9:28 pm
Erland Sommarskog (9/2/2013)
Unfortunately, this is one of the drawbacks with working set-based. If you insert rows one a time, finding the bad data is trivial. But that is of course far less efficient. A possible strategy is to trap the error with TRY-CATCH and resort to a loop in case of an error.There is very good Connect request for this problem on https://connect.microsoft.com/SQLServer/feedback/details/774754/new-virtual-table-errors-it-would-analogous-to-the-deleted-and-inserted-tables#details that you could vote for.
Thanks for the reply, Erland Sommarskog.
I voted for that feature; Microsoft should add it.
You suggested using TRY-CATCH, but the problem is that when any error happens, everything is rolled back. I am not using an explicit transaction, so perhaps it is the implicit batch/statement transaction that is causing the rollback.
Is there any workaround, or do I have to use a loop to insert the data (a performance issue)?
September 3, 2013 at 5:14 am
As long as your stored procedure is not part of an outer transaction, it is doable:
BEGIN TRY
   MERGE ...
END TRY
BEGIN CATCH
   DECLARE cur CURSOR STATIC LOCAL FOR
      SELECT ... FROM @tvp
   OPEN cur
   WHILE 1 = 1
   BEGIN
      FETCH cur INTO @var1, ...
      IF @@fetch_status <> 0
         BREAK
      BEGIN TRY
         MERGE ...
      END TRY
      BEGIN CATCH
         PRINT 'Errors for ' + ...
      END CATCH
   END
   CLOSE cur
   DEALLOCATE cur
END CATCH
But if your procedure is part of a larger transaction, this may not work, as the error may doom the transaction and you cannot do the fallback.
And of course it is a burden to maintain two code paths.
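For what it's worth, applied to the procedure posted earlier in the thread, the fast-path/fallback pattern might look something like this. This is only a sketch: the single-row retry reuses the same MERGE against a one-row derived table, and reporting the bad row with a SELECT is just one choice among several (RAISERROR, an output table, etc.).

```sql
ALTER PROC Usp_DataImport
    @TVP dbo.udt_Table READONLY
AS
BEGIN
    BEGIN TRY
        -- Fast path: one set-based MERGE for the whole TVP.
        MERGE DataImport AS T
        USING @TVP AS S
          ON T.Name = S.ud_name
        WHEN MATCHED THEN
            UPDATE SET Name = S.ud_name
        WHEN NOT MATCHED THEN
            INSERT (Name) VALUES (S.ud_name);
    END TRY
    BEGIN CATCH
        -- Slow path: retry row by row to pinpoint the offending values.
        DECLARE @name varchar(3);
        DECLARE cur CURSOR STATIC LOCAL FOR
            SELECT ud_name FROM @TVP;
        OPEN cur;
        WHILE 1 = 1
        BEGIN
            FETCH cur INTO @name;
            IF @@fetch_status <> 0
                BREAK;
            BEGIN TRY
                MERGE DataImport AS T
                USING (SELECT @name AS ud_name) AS S
                  ON T.Name = S.ud_name
                WHEN MATCHED THEN
                    UPDATE SET Name = S.ud_name
                WHEN NOT MATCHED THEN
                    INSERT (Name) VALUES (S.ud_name);
            END TRY
            BEGIN CATCH
                -- Report the bad row back to the caller.
                SELECT @name AS BadValue, ERROR_MESSAGE() AS ErrorMessage;
            END CATCH
        END
        CLOSE cur;
        DEALLOCATE cur;
    END CATCH
END
```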
[font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]
September 3, 2013 at 6:01 am
We are handling this from the front end: if any error comes back we roll back, divide the rows in half, and insert each half; this process goes on until all the rows are inserted.
September 3, 2013 at 2:13 pm
Yeah, that works too; slicing the data in half is certainly a possibility. If you insert millions of rows and there is only a single error, this may be more efficient than a row-by-row loop, particularly if you are lucky and the first half is good. (Then you know right away that it is the second half you need to slice.)
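The halving strategy described above is essentially a binary search for the bad row. In the thread it is driven from the client side, so here is only pseudocode for the shape of it:

```sql
-- Pseudocode for the client-side divide-and-conquer retry:
--
--   TryInsert(rows):
--       attempt a single set-based insert of rows
--       if it succeeds:               done with this slice
--       else if rows has one element: report that row as bad
--       else:                         TryInsert(first half)
--                                     TryInsert(second half)
--
-- Each failing batch costs one extra round trip, so with a single
-- bad row among N, roughly log2(N) attempts are enough to locate it,
-- rather than N single-row inserts.
```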
[font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]
September 3, 2013 at 4:22 pm
Surely this is a case for validation at the front end?
There always used to be a rule that applications validate input...
MM
September 4, 2013 at 1:18 am
mister.magoo (9/3/2013)
Surely this is a case for validation at the front end?There always used to be a rule that applications validate input...
Yes and no. First of all, it depends on the kind of application. It may be an ETL application, in which case you don't validate in the client, because all you have is files. (And I more or less assume that this is ETL.)
But even if the data comes from user entry, the business rules may be too complicated to warrant duplicating them in the front end. Things like mandatory fields and foreign keys are simple to enforce, but you may have more complex dependencies.
[font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]
September 4, 2013 at 7:33 am
Are you able to put a TRY-CATCH block around your SP and use SCOPE_IDENTITY()?
September 5, 2013 at 5:17 am
In your stored proc, why not simply check for rows which would cause an error (DATALENGTH > 2, in this case), select them, and return a message to the client?
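In this example that pre-check could be as simple as the following sketch (the length 2 comes from the `DataImport.Name` varchar(2) definition earlier in the thread; in a real procedure the checks would mirror whatever constraints the target table has):

```sql
-- Sketch: reject the batch up front if any TVP row would not fit
-- the target column (Name is varchar(2) in the table above).
IF EXISTS (SELECT 1 FROM @TVP WHERE DATALENGTH(ud_name) > 2)
BEGIN
    -- Return the offending rows to the client instead of merging.
    SELECT ud_name AS BadValue
    FROM   @TVP
    WHERE  DATALENGTH(ud_name) > 2;
    RETURN;
END
-- Otherwise proceed with the set-based MERGE as before.
```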