December 10, 2012 at 12:44 pm
Dear All,
Hi! I need to generate INSERT scripts for more than 1 million records and execute them on another server. The normal frequency is weekly, but sometimes I have to run it two or three times in a week.
I need to know how to batch these INSERT statements so that instead of executing them individually we can execute them in groups, say 2,000 or 5,000 rows per batch, to improve performance.
I also need to know whether inserting multiple records in a single batch will actually increase performance.
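Something like this is what I have in mind for the generated script (the table and column names are just examples); each INSERT carries up to 1,000 rows, which I understand is SQL Server's limit for a single VALUES list, and a group of those statements is wrapped in one transaction so each 2,000-5,000 row batch commits as a unit:

SET NOCOUNT ON;

BEGIN TRANSACTION;

INSERT INTO dbo.TargetTable (Id, Name)
VALUES
    (1, 'Alpha'),
    (2, 'Beta'),
    -- ... up to 1,000 rows per INSERT statement
    (1000, 'Omega');

INSERT INTO dbo.TargetTable (Id, Name)
VALUES
    (1001, 'Gamma'),
    -- ... the rest of this batch
    (2000, 'Delta');

COMMIT TRANSACTION;
GO

The idea is that committing every few thousand rows keeps the transaction log and locking on the target server manageable instead of running one giant statement per row or one huge open transaction.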
December 11, 2012 at 2:28 am
You can try BCP. It has an option to insert rows in batches (the -b switch sets the batch size), and it is generally one of the fastest ways to bulk-load data into SQL Server.
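For example (the server, database, table, and file names below are just placeholders), a character-format load committed in 5,000-row batches would look something like:

bcp MyDatabase.dbo.TargetTable in data.dat -S ServerName -T -c -b 5000

The -b switch makes bcp commit every 5,000 rows as its own transaction instead of loading the whole file in one go.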
December 11, 2012 at 2:20 pm
I would suggest creating and running an SSIS package to load the data to the other server, if that is the requirement.
December 11, 2012 at 2:41 pm
Last I knew, BCP's minimally logged loads aren't available under the FULL recovery model, and I don't fly without a parachute.
We got a huge performance increase by sending a large "multi-row" XML string into a stored procedure, parsing it into a table variable and inserting the rows in one shot. It is messy but fast and you get to keep your reliable backups.
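Roughly, the pattern looks like this (table and column names are made up for illustration); the procedure shreds the XML into a table variable with nodes()/value() and then does a single INSERT...SELECT:

CREATE PROCEDURE dbo.usp_InsertFromXml
    @Rows XML
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Staging TABLE (Id INT, Name VARCHAR(100));

    -- Shred the multi-row XML into the table variable
    INSERT INTO @Staging (Id, Name)
    SELECT r.value('(Id)[1]', 'INT'),
           r.value('(Name)[1]', 'VARCHAR(100)')
    FROM @Rows.nodes('/Rows/Row') AS x(r);

    -- Insert everything in one shot
    INSERT INTO dbo.TargetTable (Id, Name)
    SELECT Id, Name FROM @Staging;
END;

Called like:

EXEC dbo.usp_InsertFromXml
    @Rows = N'<Rows><Row><Id>1</Id><Name>Alpha</Name></Row><Row><Id>2</Id><Name>Beta</Name></Row></Rows>';

Everything stays fully logged, so the FULL recovery model and your log backups are unaffected.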