November 22, 2002 at 2:52 pm
This issue involves Visual Basic and SQL Server, using the ADO 2.6 objects.
I am writing an application that sends thousands of INSERTs to SQL Server every second. I use the ADO Connection object and its Execute method for my INSERTs. For performance reasons I cannot do the INSERTs one by one. I have to group my INSERTs into one long string (containing around 1,500 INSERT statements; a rough sketch follows the list of issues below) and then execute it with the Connection object's Execute method. The performance is great, but I have a couple of issues.
1) After the INSERTs complete, I only get back that one record was affected, when I know that 1,500 were affected.
2) If an INSERT other than the first one fails, I do not get any error message about it. I only get an error message if the first INSERT fails.
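To give a sense of it, the string I send looks roughly like this (the table and column names here are made up just for illustration):

INSERT INTO dbo.Readings (SensorId, ReadingValue) VALUES (1, 10.5)
INSERT INTO dbo.Readings (SensorId, ReadingValue) VALUES (2, 11.2)
-- ... roughly 1,500 INSERT statements concatenated this way,
-- then sent in a single call to the ADO Connection object's Execute method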
Here are my questions:
1) Is there some SQL code I could put in my long string of INSERTs that would tell me whether any of them failed?
2) Is there a way to get the records-affected count to reflect the number of INSERTs?
3) Is there another way to accomplish this task (i.e., a way to batch all my INSERTs so that SQL Server returns a true count of errors and records affected)?
I did look at BULK INSERT with XML, but that will not work for my process.
Thank you for any guidance you can give me.
Garth
November 22, 2002 at 3:07 pm
Try putting the INSERTs into one transaction:
BEGIN TRANSACTION
-- your INSERT statements
COMMIT TRANSACTION
November 22, 2002 at 3:34 pm
Thank you for your response,
I tried using transactions, but that does not solve my issue. As long as the first INSERT does not have an error, no errors are raised.
I even tried putting GO between the INSERTs, and that did not work either.
November 22, 2002 at 4:34 pm
You only get one "rows affected" count because that is the first message sent back to the client. You may also find that not all of the INSERTs are performed because of this.
Add
SET NOCOUNT ON
SET XACT_ABORT ON
to your string. With XACT_ABORT ON, a run-time error in any statement aborts the whole batch, so the INSERTs succeed or fail together. And put it all in a transaction.
You cannot get the total number of rows affected this way. To make this simpler, use an ADO Recordset and call UpdateBatch; that will generate the INSERTs for you.
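As a rough sketch, the front of the batch string could look something like this (the table and column names are only placeholders):

SET NOCOUNT ON
SET XACT_ABORT ON

BEGIN TRANSACTION

INSERT INTO dbo.Readings (SensorId, ReadingValue) VALUES (1, 10.5)
INSERT INTO dbo.Readings (SensorId, ReadingValue) VALUES (2, 11.2)
-- ... the rest of the INSERT statements ...

COMMIT TRANSACTION

With XACT_ABORT ON, an error in any statement aborts the batch and rolls back the transaction, so you are not left with a partial set of rows.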
Simon Sabin
Co-author of SQL Server 2000 XML Distilled
http://www.amazon.co.uk/exec/obidos/ASIN/1904347088
SQL Server MVP
http://sqlblogcasts.com/blogs/simons
November 22, 2002 at 4:35 pm
What about OPENXML?
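Roughly this shape, using a made-up table and XML layout (sp_xml_preparedocument parses the document, OPENXML exposes it as a rowset, and the flags value 1 means attribute-centric mapping):

DECLARE @hDoc int
EXEC sp_xml_preparedocument @hDoc OUTPUT,
     N'<root><row SensorId="1" ReadingValue="10.5"/><row SensorId="2" ReadingValue="11.2"/></root>'

INSERT INTO dbo.Readings (SensorId, ReadingValue)
SELECT SensorId, ReadingValue
FROM OPENXML(@hDoc, '/root/row', 1)
     WITH (SensorId int, ReadingValue float)

EXEC sp_xml_removedocument @hDoc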
Simon Sabin
Co-author of SQL Server 2000 XML Distilled
http://www.amazon.co.uk/exec/obidos/ASIN/1904347088
SQL Server MVP
http://sqlblogcasts.com/blogs/simons
November 26, 2002 at 8:07 am
Thanks for the response, Simon.
I tried using OPENXML, and if I were only inserting into one table it would be fine, but I am inserting into about 25 different tables. I did get a better records-affected result, but it is grouped by table, so at least I now get a records-affected count for each table. The biggest issue is that OPENXML is about 300-400 milliseconds slower than slamming the INSERTs in as one long string.
Are there any other techniques for inserting large amounts of data more efficiently?