July 11, 2013 at 1:41 pm
Hi,
I need to create a process to archive records. Does anybody have a script or know of any way to do it?
Thanks
July 11, 2013 at 2:01 pm
Archiving can be quite a complex subject. I would search this site and read the numerous articles on archiving.
July 11, 2013 at 11:47 pm
I've used this exceptional FREE auditing solution to create an audit trail, and adapted it for archiving at the same time as well. Yes, it's complicated, but the tools you need are in there!
http://autoaudit.codeplex.com/releases/view/42619
July 12, 2013 at 7:25 am
Well, by archiving I mean I need to dump one year of data from one database to another.
But the problem is I have 60 million rows, and I need this process to work fast.
Any solutions?
July 12, 2013 at 11:34 am
Script out the CREATE TABLE statement for the table you need to copy WITHOUT ANY KEYS. The target table should be a HEAP.
Script all of the necessary keys separately, then apply them after the copy.
You can do this easily by right-clicking on the db name, then Tasks -> Generate Scripts. Under the advanced options, make your selections and do a SCHEMA ONLY copy to create the script for the target table. Run just the CREATE TABLE part of the script and don't create a PK or any other keys.
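For example (the column names and types below are only placeholders; use whatever your real source table contains), the stripped-down target would look something like this:
CREATE TABLE dbo.dest_table
(
    OrderID    INT           NOT NULL,  -- placeholder columns; match your
    OrderDate  DATETIME      NOT NULL,  -- actual source table definition
    CustomerID INT           NOT NULL,
    Amount     DECIMAL(18,2) NULL
);  -- note: no PK, no indexes, no constraints, so it loads as a plain HEAP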
Then run the following:
INSERT INTO dest_table
SELECT *
FROM source_table
Now, with the data copied into the new table, apply the script to create your PK and/or clustered index first, then apply any other indexes or FKs generated by the source table scripting.
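As a rough sketch (the constraint, index, and column names are only examples; use what the Generate Scripts output actually gives you):
-- Clustered PK first...
ALTER TABLE dbo.dest_table
    ADD CONSTRAINT PK_dest_table PRIMARY KEY CLUSTERED (OrderID);
-- ...then any nonclustered indexes.
CREATE NONCLUSTERED INDEX IX_dest_table_OrderDate
    ON dbo.dest_table (OrderDate);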
With 60 million rows all of this could take a while, but it should work.
If you ever plan to do this more than once though, it might make sense to turn it into a stored procedure and use SSIS, which does such transfers much faster than you can in SSMS (which has memory limitations). SSIS can also be tuned to manage the data transfer in batches so as not to tie up all your bandwidth.
July 12, 2013 at 12:33 pm
I don't usually work with that much data, but I think you should batch the insert to minimize the impact on the system and to keep transaction log growth under control.
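Something along these lines might do it (untested; the database, table, column, and batch-size values are only placeholders you'd need to adjust for your own tables):
DECLARE @BatchSize INT = 100000,
        @RowsMoved INT = 1;

WHILE @RowsMoved > 0
BEGIN
    -- Copy one batch of year-old rows that haven't been copied yet.
    INSERT INTO ArchiveDB.dbo.dest_table
    SELECT TOP (@BatchSize) s.*
      FROM SourceDB.dbo.source_table s
     WHERE s.OrderDate < DATEADD(YEAR, -1, GETDATE())
       AND NOT EXISTS (SELECT 1
                         FROM ArchiveDB.dbo.dest_table d
                        WHERE d.OrderID = s.OrderID);

    SET @RowsMoved = @@ROWCOUNT;

    -- Keep each transaction small so the log space can be reused between
    -- batches (take log backups between batches if the db is in FULL recovery).
    CHECKPOINT;
END;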
July 12, 2013 at 5:16 pm
This would be a great place to use Table Partitioning which, after being built, would allow the nearly instantaneous "movement" of rows. It does take the "Enterprise Edition" to do it, though. A pretty decent introductory example of how easy it is to "move" data from one table to another is included at the following link:
http://msdn.microsoft.com/en-us/library/ms191174(v=sql.105).aspx
Such table data "movement" is nearly instantaneous no matter how many rows are being "moved".
If you don't have the Enterprise edition, you might be interested in "Partitioned Views" which you can also find in Books Online. Be advised that they've been "deprecated".
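Just to give you an idea of what the "move" looks like once the partitioning is in place (the table names and partition number below are only an example), it's a single metadata-only statement:
-- Switch one partition of the partitioned source table into an empty staging
-- table that has the same structure and lives on the same filegroup.
ALTER TABLE dbo.source_table
    SWITCH PARTITION 1 TO dbo.archive_staging;
-- No rows are physically copied... only metadata changes, which is why it's
-- nearly instantaneous no matter how many rows are in the partition.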
--Jeff Moden
Change is inevitable... Change for the better is not.
July 15, 2013 at 1:45 pm
I need to do it in batches, so I need some kind of loop in my SP.
Can you give me an example?
July 15, 2013 at 2:09 pm
Here is one example.
http://www.sqlservercentral.com/articles/transactions/69132/
July 15, 2013 at 8:24 pm
riya_dave (7/12/2013)
Well, by archiving I mean I need to dump one year of data from one database to another. But the problem is I have 60 million rows, and I need this process to work fast.
Any solutions?
riya_dave (7/15/2013)
I need to do it in batches, so I need some kind of loop in my SP. Can you give me an example?
There's nothing that's going to make it any faster... not even the trick that Chrissy just posted (hopefully you're not doing things in a RBAR fashion for this).
Does one year of data make up the 60 million rows?
--Jeff Moden
Change is inevitable... Change for the better is not.
July 15, 2013 at 8:26 pm
Hmmmmm..... I just had a thought and have to take back the "there's nothing faster". Can you use BCP, and is there a common folder that both servers can see?
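If so, the export/import would look something like the following (the server, share, and table names are only examples):
-- On the source side, export in native format from a command prompt:
--   bcp SourceDB.dbo.source_table out \\SomeShare\archive\source_table.dat -n -T -S SourceServer
-- Then, on the destination side, load the file with minimal logging:
BULK INSERT dbo.dest_table
FROM '\\SomeShare\archive\source_table.dat'
WITH (DATAFILETYPE = 'native', TABLOCK, BATCHSIZE = 1000000);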
--Jeff Moden
Change is inevitable... Change for the better is not.
July 15, 2013 at 8:36 pm
Chrissy321 (7/15/2013)
http://www.sqlservercentral.com/articles/transactions/69132/
Careful now... that's a full up RBAR solution even if there are batches of rows being processed. I'd really hate to see someone fall into that particular nest for something like archiving 60 million rows.
--Jeff Moden
Change is inevitable... Change for the better is not.