February 18, 2009 at 10:13 pm
Hi All,
I have a program that is supposed to make many thousands of inserts and updates within one operation. Is it feasible to do this through a for loop? If not, what would be the optimal solution?
February 19, 2009 at 5:55 am
You've got to give more details than this to get a good answer. What is the table structure, and where does this new/updated data come from?
-Roy
February 20, 2009 at 9:46 pm
Well, the structure is a normal table structure. However, I do have a web-based ASP.NET application with VB.NET, and I would like to insert thousands of rows in one transaction. So my question is: do I have to make a new connection to the DB in every iteration (iterations could be thousands in one transaction), or is there another solution?
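The short generic answer (not stated in the thread itself) is: no, you don't need a new connection per iteration; open one connection, reuse one parameterized command, and commit once. Here is a minimal sketch of that pattern using Python and SQLite purely as a stand-in for the ASP.NET/SqlClient stack; the table and row values are made up for illustration, and the VB.NET equivalent would reuse a single SqlConnection with a parameterized SqlCommand inside one SqlTransaction:

```python
import sqlite3

# Open ONE connection for the whole batch, not one per iteration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")

# Hypothetical data standing in for the thousands of rows from the web app.
rows = [(i, i * 1.5) for i in range(1, 5001)]

# executemany reuses the same prepared, parameterized statement for every
# row, and the single commit makes the whole batch one transaction.
with conn:
    conn.executemany("INSERT INTO orders (id, amount) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
conn.close()
```

On SQL Server specifically, a bulk-load API (SqlBulkCopy in ADO.NET) is usually faster still for thousands of rows, since it avoids a round trip per row entirely.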
February 21, 2009 at 4:03 pm
What you call a "normal" table structure may be the best or the worst or somewhere in between. You asked a question with no real information to go on, especially where performance is concerned. All we can do is give you some generic answers at this point because we don't know enough about the table structure, its indexes, FKs, PK, or clustered index.
If your clustered index is set up so that new rows are inserted using some temporal basis, like an IDENTITY column or a DateEntered column, then inserting a large number of rows should move right along without being broken up... unless...
If you have a large number of indexes, then it will be slow because the indexes must also be updated. Since they are not likely to be based on something temporal (with some exceptions, of course), they will create new extents (eight 8 KB pages) when they need to.
If your clustered index isn't based on something temporal, then you will also get some pretty nasty splits in that index... remember that a clustered index is (basically) the data itself.
With all of that in mind, it may or may not lock the whole table depending on the logical distribution of the data. Using a loop to limit the number of rows being inserted is absolutely useless unless you put in a time delay to give other processes using the table some headroom to operate.
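That "limit plus delay" idea can be sketched generically. The following Python/SQLite sketch (again just a stand-in for T-SQL batching; the chunk size and pause length are made-up illustrative values) commits in chunks and sleeps between them, so each short transaction releases its locks and other sessions get a turn:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")

rows = [(i, "row %d" % i) for i in range(1, 10001)]
CHUNK = 2000          # rows per committed batch (illustrative value)
PAUSE_SECONDS = 0.01  # breathing room for other sessions (illustrative value)

inserted = 0
for start in range(0, len(rows), CHUNK):
    chunk = rows[start:start + CHUNK]
    with conn:  # each chunk is its own short transaction
        conn.executemany("INSERT INTO t (id, val) VALUES (?, ?)", chunk)
    inserted += len(chunk)
    time.sleep(PAUSE_SECONDS)  # let other processes at the table

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

Without the sleep, this buys you nothing over one big insert, which is Jeff's point: the delay, not the loop, is what gives concurrent users relief.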
And, finally, using a transaction on inserts is like hitting the same spot twice with a bomb. An insert already is a transaction that will be rolled back if the insert fails. If you are doing other things during an explicit transaction, like selects, updates, and other inserts, you also run the risk of deadlocks depending on what you are doing and how long each transaction lasts.
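The point that a single INSERT is already atomic can be demonstrated with a small sketch. In this Python/SQLite illustration (autocommit mode, no explicit BEGIN anywhere; the table is hypothetical), a multi-row INSERT that hits a constraint violation leaves no partial rows behind:

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode:
# no explicit transaction is ever opened by us.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO t VALUES (1)")

try:
    # One statement, two rows; the second value violates the PK.
    # The statement is atomic, so the first value is rolled back too.
    conn.execute("INSERT INTO t VALUES (2), (1)")
except sqlite3.IntegrityError:
    pass  # the failed statement left nothing behind

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

The same statement-level atomicity holds for a single INSERT in SQL Server, which is why wrapping one insert in its own explicit transaction adds nothing.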
--Jeff Moden
Change is inevitable... Change for the better is not.