April 8, 2010 at 3:21 pm
Hi,
I am working on an archiving script for a SQL Server 2005 database.
My approach is to write a script that copies data from the LIVE database to an Archive database and then deletes the copied records from LIVE.
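Per table, the step would be roughly like this (the database, table, and column names below are just placeholders for illustration):

-- Rough sketch of the per-table copy-then-delete step.
-- LiveDB, ArchiveDB, dbo.Orders and its columns are placeholder names.
DECLARE @cutoff DATETIME;
SET @cutoff = DATEADD(YEAR, -2, GETDATE());   -- e.g. archive rows older than 2 years

BEGIN TRANSACTION;

-- 1) Copy the eligible rows into the archive.
INSERT INTO ArchiveDB.dbo.Orders (OrderID, CustomerID, OrderDate)
SELECT o.OrderID, o.CustomerID, o.OrderDate
FROM   LiveDB.dbo.Orders AS o
WHERE  o.OrderDate < @cutoff;

-- 2) Remove the same rows from LIVE.
DELETE o
FROM   LiveDB.dbo.Orders AS o
WHERE  o.OrderDate < @cutoff;

COMMIT TRANSACTION;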
Just to give you a bit of background:
•The current database size is 200 GB.
•It has around 200 tables.
•Five tables account for 40% of the whole DB size.
•The live database is mirrored to a DR environment.
•The live database is also replicated to a reporting environment.
Can you please confirm whether this is the best approach, or is there a better way to achieve this?
Thanks everyone in advance.
Rajesh Shukla
April 8, 2010 at 3:32 pm
Well, obviously you'll be archiving those 5 tables, but a few questions first:
•What about the tables they depend on?
•How will the archive be used?
•What process will be put in place for any future structure changes?
•How many tables contain transactional data (vs. lookup)?
•What criteria will be used to trigger archival?
•If a record is archived, do you archive all related data? What if some data for that record is not eligible for archive?
Sorry, those were the questions that I would be asking and trying to get answers to so I could develop a strategy.
CEWII
April 8, 2010 at 3:57 pm
Hi CEWII,
Sorry if I was not descriptive enough in my first post.
When I said 5 tables, I meant that those tables divide into two transactional sets of data, so from a business-process point of view we will have two archiving programs (say A and B).
When we run archiving program A, it will find the transactional data in all the related tables and move it to the Archive database.
The archive script will be written in such a way that it checks the following before a scheduled run:
•Database structure differences between the LIVE and Archive databases; it will flag any change it finds (sketched after this list).
•Available drive space on the Archive database.
•While the job is running, it will also monitor for blocking and deadlocks in the production database, put the archiving job into a PAUSED state, and resume once the contention clears (also sketched below).
•You can also schedule it to run for an hour or so each day.
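For the structure check, I am thinking of something along these lines (LiveDB and ArchiveDB are placeholder names for my two databases):

-- Rough sketch of the pre-run structure check: list columns that exist in
-- LIVE but are missing from the Archive, or whose data type has drifted.
-- A second pass with the databases swapped would catch the other direction.
SELECT  l.TABLE_NAME,
        l.COLUMN_NAME,
        l.DATA_TYPE AS LiveType,
        a.DATA_TYPE AS ArchiveType
FROM    LiveDB.INFORMATION_SCHEMA.COLUMNS AS l
        LEFT JOIN ArchiveDB.INFORMATION_SCHEMA.COLUMNS AS a
               ON  a.TABLE_SCHEMA = l.TABLE_SCHEMA
               AND a.TABLE_NAME   = l.TABLE_NAME
               AND a.COLUMN_NAME  = l.COLUMN_NAME
WHERE   a.COLUMN_NAME IS NULL           -- column missing from the Archive
   OR   a.DATA_TYPE  <> l.DATA_TYPE;    -- column type differs

As for the pause logic: as I understand it, SQL Server resolves a true deadlock on its own by killing one victim, so what the job can realistically watch for is blocking, roughly like this:

-- Rough sketch of the "pause while production is struggling" check,
-- using a DMV available from SQL Server 2005 onwards.
WHILE EXISTS (SELECT 1 FROM sys.dm_exec_requests WHERE blocking_session_id <> 0)
BEGIN
    RAISERROR('Blocking detected - pausing the archive batch.', 10, 1) WITH NOWAIT;
    WAITFOR DELAY '00:01:00';   -- re-check every minute
END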
The challenge for me is grouping these sets of tables into virtual functional sets.
Constraints: I am not allowed to make any major changes to the production database structure, and we can't afford any downtime.
Cheers, Rajesh.
April 8, 2010 at 8:47 pm
Build "intermediate" tables to hold the PK's of all rows affected for each table. That will allow you to work unimpeded and without crushing production until you're ready to do the final deletes.
--Jeff Moden
Change is inevitable... Change for the better is not.