December 15, 2010 at 11:46 am
What is the easiest way to copy the database schema and some data from a production db to a test db?
The prod db contains a huge amount of data. However, the test db needs only a few rows (maybe 10,000) in each table, for testing against the latest code deployed in production.
The table structure from production should be copied, but with fewer rows, so I don't think a backup & restore of the db will work in one step.
What is the industry-standard way to do this?
December 15, 2010 at 12:05 pm
at our shop, it still starts with the basics:
backup...restore.
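A minimal sketch of that first step, assuming database names and file paths that you'd replace with your own (the logical file names in the MOVE clauses are hypothetical too; check yours with RESTORE FILELISTONLY):

```sql
-- Take a copy-only backup of production so the normal backup chain
-- is not disturbed, then restore it under the test database's name.
BACKUP DATABASE ProdDB
TO DISK = N'D:\Backups\ProdDB.bak'
WITH COPY_ONLY, INIT;

RESTORE DATABASE TestDB
FROM DISK = N'D:\Backups\ProdDB.bak'
WITH MOVE N'ProdDB'     TO N'D:\Data\TestDB.mdf',   -- logical data file (assumed name)
     MOVE N'ProdDB_log' TO N'D:\Data\TestDB_log.ldf', -- logical log file (assumed name)
     REPLACE;
```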
then a custom script we wrote munges the data: it deletes records and their child records based on the FK hierarchy, which can be a real pain. there's no easy built-in way to do that, unless you alter every FK to cascade deletes...
but you know that's not a hard and fast rule... some tables need the cascade set to null, others to delete, and it's based on an analysis of the data/business rules, not the schema.
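For one parent/child pair, the trimming step might look like this sketch. Table and column names (Orders, OrderItems, OrderID) are purely illustrative; a real script walks the whole FK hierarchy:

```sql
-- Decide which parent rows to keep (e.g. the 10,000 most recent orders).
SELECT TOP (10000) OrderID
INTO #KeepOrders
FROM dbo.Orders
ORDER BY OrderID DESC;

-- Delete child rows first so FK constraints are satisfied...
DELETE oi
FROM dbo.OrderItems AS oi
WHERE NOT EXISTS (SELECT 1 FROM #KeepOrders AS k WHERE k.OrderID = oi.OrderID);

-- ...then remove the unwanted parent rows.
DELETE o
FROM dbo.Orders AS o
WHERE NOT EXISTS (SELECT 1 FROM #KeepOrders AS k WHERE k.OrderID = o.OrderID);

DROP TABLE #KeepOrders;
```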
after removing data, the script goes on to mask the data by replacing names, addresses, cities, etc. from cross-joined, randomized lists, and repeats the masking process for any columns identified beforehand as holding sensitive data. some fields get set to random numbers as well.
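A hedged sketch of that masking pass. dbo.Customers, dbo.FirstNames, and dbo.LastNames are assumed tables (the last two being lookup lists you'd populate yourself); the correlation inside each APPLY is there to force a fresh random pick per row:

```sql
-- Replace real names with random picks from lookup lists.
UPDATE c
SET FirstName = fn.Name,
    LastName  = ln.Name
FROM dbo.Customers AS c
CROSS APPLY (SELECT TOP (1) f.Name
             FROM dbo.FirstNames AS f
             WHERE c.CustomerID = c.CustomerID   -- correlate so each row gets its own pick
             ORDER BY CHECKSUM(NEWID())) AS fn
CROSS APPLY (SELECT TOP (1) l.Name
             FROM dbo.LastNames AS l
             WHERE c.CustomerID = c.CustomerID
             ORDER BY CHECKSUM(NEWID())) AS ln;

-- Sensitive numeric fields can simply be randomized.
UPDATE dbo.Customers
SET Phone = RIGHT('0000000000' + CAST(ABS(CHECKSUM(NEWID())) AS varchar(10)), 10);
```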
after all that, the roles get added and applied, then the users are added to the roles so the developers have access and can see it.
once you script it all out, it's very reusable, but there's a lot of effort in setting it up the first time.
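The role/user step could be sketched like this, assuming a hypothetical role name and Windows login:

```sql
-- Create a read-only role for developers on the restored test copy,
-- grant it access, and add a developer login to it.
CREATE ROLE db_dev_readers;
GRANT SELECT ON SCHEMA::dbo TO db_dev_readers;
EXEC sp_addrolemember N'db_dev_readers', N'DOMAIN\DevUser';  -- assumed login
```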
Lowell
December 15, 2010 at 6:48 pm
You can set up replication as one of the options. Set filters on the tables to limit the rows that get replicated. I don't think this is the easiest way, but it could work if you only need a subset of tables, for example.
I agree that the industry standard is still backup-restore.
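The row-filter approach could be sketched with a filtered article like the one below. This assumes a transactional publication named ProdToTest already exists, and the table/column names are illustrative:

```sql
-- Publish only recent Orders rows to the subscriber (test) database
-- by attaching a row filter to the article.
EXEC sp_addarticle
    @publication   = N'ProdToTest',   -- assumed existing publication
    @article       = N'Orders',
    @source_owner  = N'dbo',
    @source_object = N'Orders',
    @filter_clause = N'OrderDate >= DATEADD(month, -1, GETDATE())';
```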