May 29, 2008 at 12:12 pm
I've inherited a situation where I have two databases (legacy and target), both with the same schema (for the purposes of this question), and there is no referential integrity. The target database has been in production for a while now and the legacy database is still being used. I have to merge the data from the legacy db into the target db and keep the FKs intact. What is the most efficient way to do this? Is there an open source or third-party tool? Can this be done with Team System data compare? DTS? Do I have to write something to handle this manually?
Thanks!
-Matt
May 29, 2008 at 12:23 pm
You could do a data compare. There are various tools available to accomplish this. Visual Studio for Database Professionals and Redgate SQL Data Compare are the two I have used, and both worked well.
May 29, 2008 at 12:43 pm
You may want to take the approach of scripting the referential integrity, dropping it, and then adding it back in after the data has been migrated.
Migrating the data in the correct order can be painful and is a bit unnecessary in your case. Wherever you can, I would migrate the keys as-is (if you have identity columns as primary keys, use identity inserts) so you do not end up with new key values and have to remap all the child records.
When you add the constraints back on, check the existing data and deal with data that violates the constraints at that point. It will probably be far easier.
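A minimal T-SQL sketch of that sequence, assuming a hypothetical Customers/Orders pair linked by CustomerID (all table, column, and constraint names here are placeholders, not from the original post):

```sql
-- 1. Script the FK out for safekeeping, then drop it so load order does not matter.
ALTER TABLE dbo.Orders DROP CONSTRAINT FK_Orders_Customers;

-- 2. Migrate with the original identity values preserved.
SET IDENTITY_INSERT dbo.Customers ON;
INSERT INTO dbo.Customers (CustomerID, Name)
SELECT CustomerID, Name FROM LegacyDB.dbo.Customers;
SET IDENTITY_INSERT dbo.Customers OFF;

-- 3. Re-add the constraint. WITH CHECK validates the existing rows,
--    so any orphaned child records surface right here.
ALTER TABLE dbo.Orders WITH CHECK
    ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);
```

If step 3 fails, query for child values with no matching parent and clean those up before re-running it.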
May 29, 2008 at 12:44 pm
They resolved the foreign key issues? Here's what I'm talking about:
The auto-number PKs in the legacy database are already taken in the target database, so new PKs would be assigned, thus breaking the FKs in the other tables.
May 29, 2008 at 12:45 pm
By the way, thank you for your replies so far..!
May 29, 2008 at 12:49 pm
Oh, I did not read the post very well.
So I think you are saying that you have PK values already in your destination database that are coming from your source database. That's a bummer.
My thought is to cheat where you can and add a number to your source PK values to make them unique when compared to the values in the destination database.
May 29, 2008 at 12:53 pm
No, they did not resolve the issue you are describing in your last post. They will not work for key values that are already being used in both databases for different pieces of data. The only way I can think of is time consuming.
Say you have table A with a primary key of column1 that tables B and C reference.
Say the same tables in your destination database contain values 1-100.
You would have to update column1 in table A and the referencing columns in B and C, increasing their values by 100. You could then insert the data across.
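Sketching that in T-SQL, assuming the destination's highest key clears the collision range and that the legacy db's constraints are dropped (or cascade on update) while the values shift; the column names are illustrative:

```sql
DECLARE @offset int;
SELECT @offset = MAX(column1) FROM dbo.A;  -- highest key already used in the destination

-- Shift the legacy keys and every referencing column by the same amount,
-- so parent and child rows stay consistent with each other.
UPDATE LegacyDB.dbo.A SET column1 = column1 + @offset;
UPDATE LegacyDB.dbo.B SET column1 = column1 + @offset;  -- FK column into A
UPDATE LegacyDB.dbo.C SET column1 = column1 + @offset;  -- FK column into A

-- The shifted rows can no longer collide, so insert them with keys preserved.
SET IDENTITY_INSERT dbo.A ON;
INSERT INTO dbo.A (column1, name)
SELECT column1, name FROM LegacyDB.dbo.A;
SET IDENTITY_INSERT dbo.A OFF;
```

Afterwards, reseed the identity (DBCC CHECKIDENT) so new inserts start above the migrated range.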
May 29, 2008 at 2:21 pm
An idea that might work would be to add an 'OldKey' column to your prod database, pump the data from old to new, and allow the new db to assign a new identity.
You could then update your FKs pretty easily by referencing the OldKey fields.
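A sketch of that OldKey idea, with illustrative Parent/Child names (none of these identifiers come from the original post):

```sql
-- Carry the legacy key along so the newly assigned identity can be mapped back.
ALTER TABLE dbo.Parent ADD OldKey int NULL;

-- New identities are assigned on insert; OldKey remembers the legacy value.
INSERT INTO dbo.Parent (Name, OldKey)
SELECT Name, ParentID FROM LegacyDB.dbo.Parent;

-- Remap each child's FK through OldKey as it is copied across, so it never
-- lands in the target pointing at a stale legacy key.
INSERT INTO dbo.Child (ParentID, Detail)
SELECT p.ParentID, c.Detail
FROM   LegacyDB.dbo.Child AS c
JOIN   dbo.Parent AS p ON p.OldKey = c.ParentID;
```

Once everything checks out, the OldKey column can be dropped, or kept for auditing.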
Craig Outcalt
May 29, 2008 at 2:23 pm
Not a bad idea; this will preserve the old key in case a mistake is made. You would then be able to fix any errors in your process.
May 29, 2008 at 2:40 pm
One more hint if you go that route: put the db in RESTRICTED_USER mode first... Even though it's an online operation, nobody wants to see all the incorrect references in the app before you get them cleaned up.
and do a backup first! :w00t:
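Both safeguards in T-SQL (the database and backup path are placeholders):

```sql
-- Full backup before touching anything.
BACKUP DATABASE TargetDB
TO DISK = N'D:\Backups\TargetDB_premigration.bak';

-- Keep ordinary users out until the references are cleaned up;
-- ROLLBACK IMMEDIATE disconnects anyone already connected.
ALTER DATABASE TargetDB SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;

-- (run the migration here)

ALTER DATABASE TargetDB SET MULTI_USER;
```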
Craig Outcalt
May 29, 2008 at 2:44 pm
Thanks guys! I'm in the process of bustin' it out...