Committing small chunks of data when inserting a large dataset into a table

  • When inserting a large number of rows into a SQL Server 2000 database, we need to commit the transaction in small chunks of data instead of updating/inserting one large dataset. Committing small chunks of data clears up system resource issues such as memory and CPU utilization; updates/inserts of large datasets are always a problem for memory.

     

    I have to insert 85,000,000 records into a table in a transaction and have limited resources, so I want to use a COMMIT statement in the transaction so that it commits every 25,000 records. If anyone knows a T-SQL script for this process, please help me. If you need more information, please feel free to ask.

     

    Thanks,

    BK

  • Couldn't you do something like what was suggested in this post? Obviously you'd be running your insert statement instead of deleting... Also, insert 25,000 per minute instead of 10,000, upping your result to 1.5 million rows per hour. You could tweak that a bit, as suggested, by changing the amount inserted and the frequency; a rough sketch along those lines follows below.

    http://www.sqlservercentral.com/forums/shwmessage.aspx?forumid=8&messageid=267372

     

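    Here is a minimal sketch of that kind of batched insert for SQL Server 2000. It assumes a hypothetical staging table dbo.SourceTable feeding a target dbo.TargetTable on an integer key column ID; all table and column names are placeholders, not from the original posts. SET ROWCOUNT still limits INSERT statements in SQL Server 2000, so it caps each committed batch at 25,000 rows, and an optional WAITFOR DELAY throttles the loop to one batch per minute as the linked post suggests:

        -- cap every DML statement at 25,000 rows (SET ROWCOUNT still
        -- applies to INSERT/UPDATE/DELETE in SQL Server 2000)
        SET ROWCOUNT 25000

        DECLARE @rows INT
        SET @rows = 25000

        WHILE @rows > 0
        BEGIN
            BEGIN TRANSACTION

            -- copy the next batch of rows not yet present in the target
            -- (dbo.SourceTable, dbo.TargetTable, ID, Col1, Col2 are all
            -- hypothetical names used for illustration)
            INSERT INTO dbo.TargetTable (ID, Col1, Col2)
            SELECT s.ID, s.Col1, s.Col2
            FROM dbo.SourceTable s
            WHERE NOT EXISTS
                  (SELECT 1 FROM dbo.TargetTable t WHERE t.ID = s.ID)

            -- capture the batch size before COMMIT resets @@ROWCOUNT;
            -- it drops to 0 once the source is exhausted
            SET @rows = @@ROWCOUNT

            COMMIT TRANSACTION

            -- optional throttle: pause one minute between batches
            WAITFOR DELAY '00:01:00'
        END

        SET ROWCOUNT 0   -- reset so later statements are unaffected

    With 85 million rows, the NOT EXISTS probe gets more expensive as the target fills up; if the key is a contiguous integer, stepping a WHERE ID >= @lo AND ID < @lo + 25000 range through the source instead avoids rescanning rows that have already been copied.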

