August 17, 2009 at 4:01 pm
I'm putting together SSIS packages to replicate tables from our production database farm to our testing and development environments. We used to do it on demand, but the requests come in too often and it makes sense to automate. My question is which of the following is the better way to replicate 25 tables:
1. Truncate the destination table, then reload the data; repeat for all tables (see the sketch below)
2. Use the Transfer SQL Server Objects Task
I think both ways will work, but which one makes more sense?
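For reference, here's roughly what option 1 boils down to per table. The database, table, and column names below are placeholders, not our real schema, and in practice the reload step would be an SSIS data flow rather than a linked-server query:

-- Option 1, per table: truncate the destination, then reload it.
-- TRUNCATE TABLE itself is minimally logged; the reload is what writes to the log.
TRUNCATE TABLE TargetDB.dbo.Customers;

INSERT INTO TargetDB.dbo.Customers (CustomerID, Name, ModifiedDate)
SELECT CustomerID, Name, ModifiedDate
FROM   SourceServer.SourceDB.dbo.Customers;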
August 17, 2009 at 7:51 pm
If they're all on a SAN, many SANs either come with, or let you purchase, an extra bit of software that can "snapshot" the databases virtually on demand and in only seconds.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 18, 2009 at 8:27 am
We do have a SAN, but that option is not available to us at this time.
I have option 1 running currently, but it's caused quite a bit of transaction log usage. I can manage it easily enough, but does option 2 cause the same amount of transaction log usage? Anyone know?
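In case it helps anyone else, a couple of standard commands for keeping an eye on log usage; the database name and backup path are placeholders:

-- Report log size and percent used for every database on the instance.
DBCC SQLPERF(LOGSPACE);

-- Under the FULL recovery model, taking log backups between table loads
-- lets the log space be reused instead of growing.
BACKUP LOG TargetDB TO DISK = N'E:\Backups\TargetDB_log.trn';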
August 20, 2009 at 1:40 am
Not to go into this in great depth, but yes, option 2 will generate a fair bit of logging too. Have a look at your database's recovery model, then do some research into whether your setup will allow you to use the "Simple" recovery model. That should sort out any logging problems you have until you can review your extraction method. A quick pointer in another direction is to do an incremental load, i.e. only load the data that is missing or changed since the previous load (there's loads of material out there). Some things to take into account are the volume of data, how volatile the history is, frequency of loading, etc.
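To make that concrete, a rough sketch of both ideas; the database, table, and column names are made up, and the MERGE approach assumes SQL Server 2008 or later:

-- Check the current recovery model of each database.
SELECT name, recovery_model_desc
FROM   sys.databases;

-- Switch a test/dev copy to SIMPLE so the log truncates on checkpoint.
ALTER DATABASE TargetDB SET RECOVERY SIMPLE;

-- Incremental load: only bring over rows that are missing or changed
-- since the previous load, keyed here on a ModifiedDate column.
MERGE TargetDB.dbo.Customers AS tgt
USING SourceDB.dbo.Customers AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED AND src.ModifiedDate > tgt.ModifiedDate THEN
    UPDATE SET tgt.Name = src.Name,
               tgt.ModifiedDate = src.ModifiedDate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, Name, ModifiedDate)
    VALUES (src.CustomerID, src.Name, src.ModifiedDate);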