October 17, 2011 at 5:30 pm
Hello Everyone,
Hope all is well with you.
I am trying to copy data from a table in one database to a new table in a different database, and then delete it from the old database. To do this I ended up writing a query with 6 joins just for the copy. The data I am trying to copy is almost 1 million records, and I am planning on putting the script in a job that runs daily during late hours. Before implementing it in production I tested it in a test environment, and it's consuming a lot of time for the copy alone, and I still have to run my delete job. Is there a way to speed up this process? This data is for archiving purposes only; only the active transactions remain in the database.
Please let me know what is the best approach.
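[Since the copy and the delete move the same rows, one pattern worth testing is a single batched DELETE with an OUTPUT clause, so each batch is copied and removed in one statement. This is only a sketch: the database, table, and column names (LiveDB, ArchiveDB, dbo.Orders, OrderDate, etc.) and the 6-month cut-off are made up for illustration, and it assumes both databases are on the same instance, since OUTPUT ... INTO cannot target a remote table.]

```sql
-- Move archivable rows in small batches to keep log growth and locking modest.
-- Hypothetical names throughout; replace with your own tables and filter.
DECLARE @BatchSize int = 10000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize) src
    OUTPUT deleted.OrderID, deleted.OrderDate, deleted.Amount
        INTO ArchiveDB.dbo.OrdersArchive (OrderID, OrderDate, Amount)
    FROM LiveDB.dbo.Orders AS src
    WHERE src.OrderDate < DATEADD(MONTH, -6, GETDATE());  -- archive cut-off

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to move
END
```

Batching also means the nightly job can be stopped and resumed without losing work, since each batch commits on its own.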
Thanks a bunch.
“If your actions inspire others to dream more, learn more, do more and become more, you are a leader.” -- John Quincy Adams
October 17, 2011 at 6:32 pm
Bulk Insert will almost always be the fastest way to transfer large amounts of data.
The best way is to create an SSIS package to transfer the data and schedule a job to run the SSIS package.
It will also help if you set the Recovery Model to Bulk-Logged, as that minimizes logging as long as your destination table is empty or fits certain criteria.
-------------------------------------------------
Will C,
MCITP 2008 Database Admin, Developer
October 18, 2011 at 1:10 am
>> To do this I ended up writing a query with 6 joins for copying only. The data I am trying to copy is almost 1 million records<<
Would you mind posting the query, along with your table structures and some sample data?
October 18, 2011 at 3:15 am
Did you try disabling the nonclustered indexes, foreign key constraints, triggers, etc. associated with the tables?
Once the operation is completed you can re-enable them.
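[A sketch of that disable/re-enable wrapper, using the same hypothetical table name as above (dbo.OrdersArchive) and a made-up index name. Only disable nonclustered indexes: disabling the clustered index makes the whole table inaccessible.]

```sql
-- Before the load: pause index and trigger maintenance on the destination.
ALTER INDEX IX_OrdersArchive_OrderDate ON dbo.OrdersArchive DISABLE;
DISABLE TRIGGER ALL ON dbo.OrdersArchive;
ALTER TABLE dbo.OrdersArchive NOCHECK CONSTRAINT ALL;  -- stop FK checks

-- ... run the bulk copy here ...

-- After the load: rebuilding re-enables the disabled indexes.
ALTER INDEX ALL ON dbo.OrdersArchive REBUILD;
ENABLE TRIGGER ALL ON dbo.OrdersArchive;
ALTER TABLE dbo.OrdersArchive WITH CHECK CHECK CONSTRAINT ALL;  -- re-check and re-trust FKs
```

The trade-off is the index rebuild cost at the end, which is usually cheaper than maintaining the indexes row by row during a million-row insert.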
Thanks & Regards
Sathya.