Replication/Log Shipping Query?

  • Hi there,

    I'm having difficulty finding information on how to do the following....Can anybody help?

    We have a helpdesk system running on a SQL Server 2000 server that has grown very rapidly, as it is a 24-hour system. The application has become a bit sluggish, even after some fairly thorough database tuning.

    The users of the system say they only need six months' worth of calls live on the system; however, data older than this cannot be permanently deleted and still needs to exist for reporting, auditing, etc. A second server is available to me to put an instance of the database on. The plan is to have a live box with six to seven months' worth of data. Ideally, every month a job would cut the older data out of the live system and append it to the other system without deleting what is already there; that way it could still be used as an archive for selected users to access and report on.

    Can anybody offer any advice? I've been researching the various types of replication and log shipping, but I haven't come across a situation similar to mine, although I can't be the first to want to do this. Am I looking in the wrong area? Any tips would be greatly appreciated!

     

    Thank you

  • Consider using DTS for this. Replication and log shipping may be more complex than what you need. You could create a DTS package and schedule it for monthly execution.

    The package could have a Transform Data task for each table that queries the production database for data older than six months (this assumes you have a date column) and inserts it into the corresponding tables in the archive database.
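    As a rough sketch, the monthly move could be expressed in plain T-SQL. The table, column, and database names used here (Calls, CallID, CallDate, HelpdeskArchive) are placeholders; the actual DTS task would wrap equivalent logic for each of your tables:

    ```sql
    -- Sketch only: assumes a Calls table with a CallDate datetime column
    -- and an archive database named HelpdeskArchive with a matching table.
    BEGIN TRANSACTION

    -- Copy rows older than 6 months into the archive, without
    -- touching rows that are already there.
    INSERT INTO HelpdeskArchive.dbo.Calls
    SELECT c.*
    FROM   dbo.Calls c
    WHERE  c.CallDate < DATEADD(month, -6, GETDATE())
      AND  NOT EXISTS (SELECT 1
                       FROM   HelpdeskArchive.dbo.Calls a
                       WHERE  a.CallID = c.CallID)

    -- Remove the archived rows from the live system.
    DELETE FROM dbo.Calls
    WHERE  CallDate < DATEADD(month, -6, GETDATE())

    COMMIT TRANSACTION
    ```

    Wrapping the insert and delete in one transaction means a failed run never deletes rows that did not reach the archive.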

    Greg

  • I still haven't found the best solution for this, but I am using two different ways to deal with it.

    One: use a DTS package to copy the data over to the second box and delete it from the live system. This is a simple approach and works well as long as you choose the right data to move. It can put a little overhead on the live server, depending on how you set it up and how you deal with it. I am using the "Data Driven Query Task" in DTS to move and delete the rows.

    Two: set up replication from Server A to Server B. I am using transactional replication, with another database on Server B that holds a history of all the data. A job copies data older than X days into that second database and then deletes data older than X+10 days from the replicated copy. That way there is some duplicated data between the live and second servers, but you have 10 days to fix the job if it fails.
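    A minimal sketch of that Server B job, assuming placeholder names (ReplicatedDB, HistoryDB, Calls, CallID, CallDate):

    ```sql
    -- Sketch of the Server B cleanup job described above; all object
    -- names are placeholders for your own schema.
    DECLARE @X int
    SET @X = 180   -- retention on the replicated copy, in days

    -- Step 1: copy anything older than X days into the history
    -- database, skipping rows copied on a previous run.
    INSERT INTO HistoryDB.dbo.Calls
    SELECT r.*
    FROM   ReplicatedDB.dbo.Calls r
    WHERE  r.CallDate < DATEADD(day, -@X, GETDATE())
      AND  NOT EXISTS (SELECT 1
                       FROM   HistoryDB.dbo.Calls h
                       WHERE  h.CallID = r.CallID)

    -- Step 2: only delete rows older than X+10 days, so the two
    -- copies overlap by 10 days and a failed run can be rerun safely.
    DELETE FROM ReplicatedDB.dbo.Calls
    WHERE  CallDate < DATEADD(day, -(@X + 10), GETDATE())
    ```

    The 10-day overlap is the safety margin: if step 1 fails, nothing is lost, because the delete in step 2 only ever removes rows that have been eligible for copying for at least 10 days.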

     

  • Thanks both for your help.

  • Taking a cut with DTS seems sound (though I like transactional replication!). If you've got Enterprise Edition, you could even set up a distributed partitioned view to keep the archived data easily accessible.

    It still seems extreme, though. A larger amount of data doesn't necessarily translate to slower performance, just more space used. It might be worth having someone else take a look and see if they can suggest ways to optimize what you have.
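    A distributed partitioned view of the kind mentioned above might look roughly like this. ArchiveServer is a hypothetical linked server pointing at the archive box, and in practice each member table needs a CHECK constraint on the partitioning column so the optimizer can route queries to the right server:

    ```sql
    -- Sketch only: assumes a linked server named ArchiveServer and
    -- identically structured Calls tables on both boxes.
    CREATE VIEW dbo.AllCalls
    AS
    SELECT * FROM dbo.Calls                         -- recent rows, live server
    UNION ALL
    SELECT * FROM ArchiveServer.Archive.dbo.Calls   -- older rows, archive server
    ```

    Selected users could then report against AllCalls and see live and archived calls as a single table.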
