June 24, 2008 at 1:27 pm
I have a table with over a billion rows. I have created another table that is partitioned and has the same indexes, which are also partitioned. I now want to transfer all of the records from the original table to the new table. What is the best approach for this?
I tried inserting 100 million records, and the log grew to 225 GB! I only have 250 GB for the logs, so it would definitely run out if I inserted all of the records. Any suggestions?
I am using SQL Server 2005. Please let me know if there is any way of shutting off logging when doing such a large insert. The database is already in the simple recovery model.
Thanks.
June 24, 2008 at 2:25 pm
Set the recovery model to SIMPLE or BULK_LOGGED... use SELECT INTO to make the tables and transfer the data. Then add the indexes, keys, triggers, and constraints... use NOCHECK on the constraints because you already know the data met the constraints on the original table.
Do a backup when you're done.
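A rough sketch of that sequence; every name here (MyDb, dbo.BigTable, dbo.BigTable_New, ps_BigTable, PartitionKeyCol) is a placeholder since I don't know your schema:

ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- SELECT INTO is minimally logged under the SIMPLE and BULK_LOGGED recovery models,
-- so the copy itself keeps the log small
SELECT *
INTO   dbo.BigTable_New
FROM   dbo.BigTable;

-- Building the clustered index on the partition scheme is what actually
-- partitions the new table
CREATE CLUSTERED INDEX IX_BigTable_New_PartKey
    ON dbo.BigTable_New (PartitionKeyCol)
    ON ps_BigTable (PartitionKeyCol);

-- Add the remaining keys/constraints without re-validating a billion rows
ALTER TABLE dbo.BigTable_New WITH NOCHECK
    ADD CONSTRAINT CK_BigTable_New_Example CHECK (PartitionKeyCol IS NOT NULL);

-- Full backup once it's all done
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb.bak';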
--Jeff Moden
Change is inevitable... Change for the better is not.
June 24, 2008 at 2:27 pm
You will probably need to use SELECT INTO and then the ALTER TABLE ... SWITCH statement to move the data into the partitioned table. I am not sure, though, that this can be done "online" (meaning letting users change the source data).
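Something roughly like this; the staging table, date range, and partition number are hypothetical, and the staging table has to live on the same filegroup as the target partition with a matching structure and clustered index:

-- Stage one partition's worth of rows (all names and ranges are placeholders)
SELECT *
INTO   dbo.BigTable_Stage
FROM   dbo.BigTable
WHERE  PartitionKeyCol >= '20080101' AND PartitionKeyCol < '20080201';

-- The staging table must match the partitioned table's clustered index
CREATE CLUSTERED INDEX IX_Stage_PartKey ON dbo.BigTable_Stage (PartitionKeyCol);

-- ...and carry a trusted CHECK constraint covering exactly the target partition's range
ALTER TABLE dbo.BigTable_Stage WITH CHECK
    ADD CONSTRAINT CK_Stage_Range
    CHECK (PartitionKeyCol >= '20080101' AND PartitionKeyCol < '20080201');

-- Metadata-only switch of the staged rows into the target partition
ALTER TABLE dbo.BigTable_Stage
    SWITCH TO dbo.BigTable_Partitioned PARTITION 3;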
Just my $0.02
* Noel
June 24, 2008 at 2:28 pm
Hehehe ... Jeff beat me to the punch 😀
* Noel
June 24, 2008 at 3:33 pm
Noel is right: it's a very good idea to partition the new table right away and then insert the data in batches based on the partition keys....
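For example, something along these lines; the month-by-month range, table names, and partitioning column are just placeholders:

-- Each batch is its own transaction, so in the SIMPLE recovery model the log
-- space gets reused after every CHECKPOINT instead of growing for all billion rows
DECLARE @from datetime, @to datetime;
SET @from = '20070101';

WHILE @from < '20090101'
BEGIN
    SET @to = DATEADD(MONTH, 1, @from);

    INSERT INTO dbo.BigTable_Partitioned
    SELECT *
    FROM   dbo.BigTable
    WHERE  PartitionKeyCol >= @from AND PartitionKeyCol < @to;

    CHECKPOINT;   -- lets SIMPLE recovery truncate the inactive part of the log
    SET @from = @to;
END;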
Prakash Heda
Lead DBA Team - www.sqlfeatures.com
Video sessions on Performance Tuning and SQL 2012 HA
June 27, 2008 at 8:45 am
What I ended up using was a simple .NET script using SqlBulkCopy. This worked out pretty well and was faster than the other things I tried.