Creating a Test environment for a 10TB database

  • Hi All,

    Working on SQL Server 2008 R2 Enterprise Edition.

    I need to create a testing environment for this production server. One of the databases is 10TB in size and consists of multiple filegroups.

    The issue I'm having is that there's not enough space on the Test server to do a full restore.

    I was thinking of creating a blank database and then exporting data from Production to Test (not all of the data is required; rough sketch of the blank database below)...but this will obviously affect the production server (this environment is 24/7).

    Just wanted to check if there are any better ways of doing this...with no downtime on Production.
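
    In case it helps, this is roughly the blank-database layout I had in mind. A minimal sketch only -- the filegroup names, file paths, and sizes are placeholders, not our real layout:

        CREATE DATABASE TestDB
        ON PRIMARY
            ( NAME = TestDB_Primary, FILENAME = 'T:\Data\TestDB_Primary.mdf', SIZE = 10GB ),
        FILEGROUP FG_Current
            ( NAME = TestDB_Current, FILENAME = 'T:\Data\TestDB_Current.ndf', SIZE = 50GB ),
        FILEGROUP FG_Archive
            ( NAME = TestDB_Archive, FILENAME = 'T:\Data\TestDB_Archive.ndf', SIZE = 20GB )
        LOG ON
            ( NAME = TestDB_Log, FILENAME = 'L:\Log\TestDB_Log.ldf', SIZE = 5GB );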

  • There are utilities that enable you to mount a backup file as a database without actually restoring it. So you could back up your live database, then mount the backup file on your test server and copy the data that you need from it to a permanent database.
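
    Roughly, the copy step might look like this -- a sketch only, assuming the utility has mounted the backup as a database called ProdBackupMounted on the test instance, and using a made-up dbo.Orders table:

        -- Run on the Test instance once the backup is mounted as a database
        SELECT o.*
        INTO   TestDB.dbo.Orders                 -- permanent copy in the test database
        FROM   ProdBackupMounted.dbo.Orders AS o
        WHERE  o.OrderDate >= '20110101';        -- pull only the slice you need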

    John

  • Another option, but only for testing read operations, is to export the statistics from the production database and import them into your test database. The query optimizer will then assume it has millions of rows in what are actually empty tables, so execution plans will behave very much as they do on production. This doesn't help with testing insert/update/delete queries, because those are likely to trigger a statistics update, which will replace the imported stats with real ones based on the empty tables.
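
    As a rough sketch of what the imported stats end up looking like -- assuming you script them out of Production with SSMS's Generate Scripts option for statistics and histograms, and using a made-up table and statistic name; the binary blob and counts below are placeholders for whatever SSMS actually generates:

        -- Scripted from Production, replayed against the empty table on Test
        UPDATE STATISTICS [dbo].[Orders] ([IX_Orders_OrderDate])
        WITH STATS_STREAM = 0x01000000,   -- placeholder; SSMS scripts the full histogram blob here
             ROWCOUNT     = 250000000,    -- makes the optimizer see Production-sized row counts
             PAGECOUNT    = 3000000;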

    Otherwise, some type of import is probably your best bet.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning
