December 18, 2008 at 1:49 pm
I'd like to be able to copy about 1 GB's worth of data out of a huge database into a development environment.
There are, though, hundreds of tables and views.
What's the easiest way to go about this?
December 18, 2008 at 2:47 pm
I don't know of a query that will pull out exactly 1 GB of data across entire tables...
Do you want to take sample records from each table?
What's the purpose?
December 18, 2008 at 2:59 pm
Cloning the live database, which is enormous, would strain our network. Ideally we want, say, 100 rows from each table copied into the corresponding table in an empty database.
Is this possible without taking the live system down?
December 18, 2008 at 3:24 pm
If you intend to create a sort of UAT or test environment, you should back up your production database and restore it on the second server. A backup runs online, so you don't need to take the live system down.
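Something along these lines would do it (just a sketch; ProdDB, DevCopy and the paths are placeholders, and you should run RESTORE FILELISTONLY against the backup to get the real logical file names):

-- On the production server: COPY_ONLY (SQL 2005+) leaves the normal backup chain untouched
BACKUP DATABASE ProdDB
    TO DISK = N'\\fileshare\backups\ProdDB.bak'
    WITH COPY_ONLY, INIT;

-- On the dev/test server: logical file names here are assumed, check with RESTORE FILELISTONLY
RESTORE DATABASE DevCopy
    FROM DISK = N'\\fileshare\backups\ProdDB.bak'
    WITH MOVE N'ProdDB'     TO N'D:\SQLData\DevCopy.mdf',
         MOVE N'ProdDB_log' TO N'D:\SQLData\DevCopy_log.ldf',
         REPLACE;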
It is possible to pull out 100 records per table, but there will be issues! If I had to, I would write generic queries (rough sketch below)...
1. Identity columns would create issues.
2. Foreign key references would be a problem (you can't be sure that the parent rows referenced by the 100-row sample from table B will actually be among the 100 rows sampled from table A).
You would also compromise on table structures and constraints with this approach. You would still need to create views, stored procedures, indexes, triggers and other objects separately.
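Roughly, the generic queries could be generated like this (only a sketch, assuming the empty copy is a database named DevCopy with identical table structures; expect the identity and foreign key failures above unless you disable constraints or load tables in dependency order):

DECLARE @schema SYSNAME, @table SYSNAME, @sql NVARCHAR(MAX);

DECLARE tbl CURSOR FOR
    SELECT s.name, t.name
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id;

OPEN tbl;
FETCH NEXT FROM tbl INTO @schema, @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one INSERT ... SELECT TOP (100) per user table.
    -- Tables with identity columns would also need SET IDENTITY_INSERT ... ON
    -- and an explicit column list instead of SELECT *.
    SET @sql = N'INSERT INTO DevCopy.' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table)
             + N' SELECT TOP (100) * FROM ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table) + N';';
    PRINT @sql;  -- review the generated statements first; swap PRINT for EXEC(@sql) to run them
    FETCH NEXT FROM tbl INTO @schema, @table;
END;
CLOSE tbl;
DEALLOCATE tbl;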
Even so, I'd suggest taking a backup and restoring it on the secondary!