Refreshing Test with Production Data

  • Hi,

    I would like to know if there is an easy way to copy all tables, stored procedures, etc., and data from production to test.

    I have SQL 2008 R2 installed on my laptop and would like to copy the data from our production environment to my laptop every week so that I can do my testing in that test environment. I would like to copy only, say, 100,000 rows of data, not all of it. Are there any good scripts out there to do this?

    Thanks

    Greg

  • If you want all tables and procedures, why not use the production database backup to restore the database on your laptop? You can also automate this with a job.
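
    If it helps, here's a minimal sketch of that approach in T-SQL; the database name, file paths, and logical file names below are only assumptions, so check yours with RESTORE FILELISTONLY first.

        -- On the production server: take a full backup to a shared location.
        BACKUP DATABASE [Production]
            TO DISK = N'\\fileshare\Backups\Production.bak'
            WITH INIT;

        -- On the laptop (SQL 2008 R2): restore it under a test name.
        -- The logical file names ('Production', 'Production_log') are assumptions;
        -- run RESTORE FILELISTONLY against the backup to find the real ones.
        RESTORE DATABASE [Production_Test]
            FROM DISK = N'C:\Backups\Production.bak'
            WITH MOVE N'Production'     TO N'C:\Data\Production_Test.mdf',
                 MOVE N'Production_log' TO N'C:\Data\Production_Test_log.ldf',
                 REPLACE, RECOVERY;

    You can then wrap both steps in SQL Server Agent jobs to refresh the copy every week.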

    If you just want to create the database structure, script the production database: right-click the database -> Tasks -> Generate Scripts. Once you have the database created, you can use the Import/Export wizard to copy 100,000 rows of any table: right-click the production database -> Tasks -> Export Data. Select your laptop as the destination and you will get the option to copy the table or write your own SQL statement.
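
    For the 100,000-row part, the "write your own SQL statement" option is where you limit the rows. Something along these lines would do it (the table and column names here are just placeholders):

        -- The kind of query you might paste into the Export Data wizard's
        -- query step to pull a sample instead of the whole table.
        SELECT TOP (100000) *
        FROM dbo.Orders
        ORDER BY OrderDate DESC;  -- take the most recent rows rather than an arbitrary 100,000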

    Hope this helps you.

  • There's no easy way to move a subset of the data. You have to take into account that there are so many relationships to maintain between the tables; you can't just select the first 10,000 rows from each table and be done, because you'll end up with all sorts of foreign key violations and the like.

    Your basic options are:

    1) Backup & restore production: Easiest, but it brings over a lot more data than you need

    2) Backup & restore production to another server, then clean the data there: You can then back that up and distribute it around. It has the advantage that you can script the cleanup process and automate it (see the sketch after this list). I've done this in the past and found it worked well.

    3) Just move the structures: There are a number of ways to do this, but you move the structure and then supply data later

    4) Do #3 and build data load scripts: You can automate the application of data to a structure. The hard part is maintaining it over time. I've done this one too. It worked extremely well, but it requires a great deal of discipline to keep the data appropriately synced with the structural changes.

    5) Purchase 3rd-party tools: There are a number of different ones out there. For example, two different tools from my company might address this issue. First, we have a tool called SQL Data Generator that creates test data; you could use that in combination with #3 above. Second, we have a tool called SQL Virtual Restore that allows you to use a backup as if it were a database without actually creating a database. That saves tons of space on your client system, so you could access production backups without affecting the backup or having to have a system big enough to hold all of production.
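
    For what it's worth, a rough sketch of option 2 might look like the following; every database, table, and path name here is invented, and the DELETE order depends entirely on your own foreign keys.

        -- Restore a copy of production onto a utility server.
        RESTORE DATABASE [Prod_Scrubbed]
            FROM DISK = N'C:\Backups\Production.bak'
            WITH MOVE N'Production'     TO N'C:\Data\Prod_Scrubbed.mdf',
                 MOVE N'Production_log' TO N'C:\Data\Prod_Scrubbed_log.ldf',
                 REPLACE;

        USE [Prod_Scrubbed];

        -- Trim to a consistent subset, deleting children before parents so the
        -- foreign key relationships stay intact (these tables are hypothetical).
        DELETE FROM dbo.OrderDetails
        WHERE OrderID NOT IN (SELECT TOP (100000) OrderID
                              FROM dbo.Orders ORDER BY OrderDate DESC);

        DELETE FROM dbo.Orders
        WHERE OrderID NOT IN (SELECT TOP (100000) OrderID
                              FROM dbo.Orders ORDER BY OrderDate DESC);

        -- Reclaim the space, then back up the smaller copy for distribution.
        DBCC SHRINKDATABASE ([Prod_Scrubbed]);
        BACKUP DATABASE [Prod_Scrubbed]
            TO DISK = N'C:\Backups\Prod_Scrubbed.bak' WITH INIT;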

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • What you should be doing is building yourself a dataset that mimics your live data, and using that database for development and possibly even testing. This allows dev and testing without placing dependencies on your live, and potentially sensitive, dataset.

    Development is not just about spewing out reams of code; as part of the development lifecycle you would ideally create a model dataset that the devs use to build the application.
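
    As a toy illustration of what model data can look like (the Customers table and its columns here are made up), a numbers-style query can spin up plenty of rows without ever touching production:

        -- Generate 100,000 synthetic customers; nothing here comes from live data.
        ;WITH Numbers AS (
            SELECT TOP (100000)
                   ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
            FROM sys.all_objects AS a
            CROSS JOIN sys.all_objects AS b
        )
        INSERT INTO dbo.Customers (CustomerID, CustomerName, SignupDate)
        SELECT n,
               'Customer ' + CAST(n AS varchar(10)),
               DATEADD(DAY, -(n % 3650), GETDATE())
        FROM Numbers;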

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • Greg:

    Make sure you comply with your company's rules about copying any sensitive data.


    Peter Maloof
    Serving Data
