November 7, 2013 at 8:08 pm
I have a 64 million record table that I need to create temporary backups of.
I am currently using the SELECT INTO... FROM... method, but this doesn't create the indexes etc. on the backup table, so it would require some extra manual steps to restore the table and then re-add the indexes etc. to return it to its original state.
Is there a better/quicker way of doing this - it currently takes around 7 hours.
November 7, 2013 at 8:24 pm
Can it be real time (i.e. transactional replication), or does it have to be off hours? Does it have to be created from scratch every time, or can you gradually add data (i.e. CDC)?
November 7, 2013 at 8:33 pm
It has to be off hours and be a point-in-time restore scenario - so I can't really use transactional rep.
I've thought about using bcp - would that be any quicker?
November 8, 2013 at 3:56 am
shindle 17293 (11/7/2013)
I am currently using the SELECT INTO... FROM... method, but this doesn't create the indexes etc. on the backup table, so it would require some extra manual steps to restore the table and then re-add the indexes etc. to return it to its original state.
If this is a routine (repetitive) process, then create the table definition up front, along with the required indexes, and do a "truncate and load" every time.
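A minimal sketch of that truncate-and-load approach (table, column, and index names here are illustrative, not from the original post):

```sql
-- One-time setup: create the backup table with the same structure
-- and indexes as the source (names are examples only).
CREATE TABLE dbo.BigTable_Backup (
    Id        INT          NOT NULL,
    SomeValue VARCHAR(100) NULL,
    CONSTRAINT PK_BigTable_Backup PRIMARY KEY CLUSTERED (Id)
);
CREATE NONCLUSTERED INDEX IX_BigTable_Backup_SomeValue
    ON dbo.BigTable_Backup (SomeValue);

-- Repeatable part: empty the target, then reload it in one
-- set-based insert. TRUNCATE is minimally logged and fast.
TRUNCATE TABLE dbo.BigTable_Backup;

INSERT INTO dbo.BigTable_Backup WITH (TABLOCK) (Id, SomeValue)
SELECT Id, SomeValue
FROM dbo.BigTable;
```

Because the table and its indexes already exist, there is nothing to re-create after the load; the trade-off is that loading into a table with indexes in place can be slower than loading a heap and building the indexes afterwards.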
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
November 8, 2013 at 7:04 am
shindle 17293 (11/7/2013)
I have a 64 million record table that I need to create temporary backups of. I am currently using the SELECT INTO... FROM... method, but this doesn't create the indexes etc. on the backup table, so it would require some extra manual steps to restore the table and then re-add the indexes etc. to return it to its original state.
Is there a better/quicker way of doing this - it currently takes around 7 hours.
1) Monitor the process to see what the waits are. Watch file IO stalls and wait stats, and also use sp_whoisactive during execution.
2) The biggest bang for the buck is almost certainly ensuring you are getting a minimally logged population of the table. Review Books Online for the various requirements for this, but it is often a magic bullet for activities such as this.
3) Script everything, including final NC index builds, and make it a SQL Agent job that you can execute on demand (or on a schedule if you wish).
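Points 2 and 3 together might look roughly like this (a sketch only - database and object names are placeholders, and minimal logging assumes the target starts empty and the documented prerequisites are met):

```sql
-- Bulk-logged recovery plus TABLOCK on an empty target are among
-- the prerequisites for a minimally logged insert; see
-- "Prerequisites for Minimal Logging in Bulk Import" in Books Online.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

INSERT INTO dbo.BigTable_Backup WITH (TABLOCK)
SELECT *
FROM dbo.BigTable;

-- Build the nonclustered indexes after the load (usually faster
-- than loading into an indexed table), then switch back.
CREATE NONCLUSTERED INDEX IX_BigTable_Backup_SomeValue
    ON dbo.BigTable_Backup (SomeValue);

ALTER DATABASE MyDb SET RECOVERY FULL;
```

Wrapped in a SQL Agent job, this can run on demand or on a schedule, as suggested.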
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
November 8, 2013 at 7:18 am
As was suggested, I would:
1. Create a permanent copy table, with indexes and all, on a separate filegroup (and underlying disks) from production. This copy table could even be in a separate database.
2. Bulk copy your data out of the production table.
3. Truncate your target table.
4. If you are in the production database, set your recovery model to Bulk Logged.
5. BCP your data into the target table.
6. Set your recovery model back to Full.
7. Take a full backup.
This could all be set up as a scheduled job.
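The steps above could be scripted roughly as follows (database names, table names, and file paths are all placeholders; the bcp calls would run from a command-line or CmdExec job step, shown here as comments):

```sql
-- Step 2 (from the command line / a CmdExec job step):
--   bcp MyDb.dbo.BigTable out C:\backup\BigTable.dat -n -T -S MyServer

-- Steps 3 and 4: empty the target and allow minimal logging.
TRUNCATE TABLE MyDb.dbo.BigTable_Backup;
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- Step 5 (command line again; the TABLOCK hint helps bcp qualify
-- for minimal logging):
--   bcp MyDb.dbo.BigTable_Backup in C:\backup\BigTable.dat -n -T -S MyServer -h "TABLOCK"

-- Steps 6 and 7: restore full logging and take a full backup.
ALTER DATABASE MyDb SET RECOVERY FULL;
BACKUP DATABASE MyDb TO DISK = N'C:\backup\MyDb.bak';
```

Note that switching out of Full recovery breaks the log backup chain until the next full (or differential) backup, which is why step 7 follows immediately.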