March 28, 2015 at 8:12 am
Hi all,
I have configured a linked server on our Test server to connect to the Prod server. I want to copy a 'customer' table with 50 million plus rows into the Test server to do some testing.
My questions are:
1. "select * into 'TestTable' from prodServer.database.schema.tableName" Is this the best option to copy a table?
2. will there be any impact on Prod server if i run this command on TestServer (copy a table from Prod to Test server)?
3. Will there by any log generation on both Prod and test server? I have 'simple' recovery on Test server but full recovery on Prod. server
I need you guys suggestions. Please suggest me ONLY if you are absolutely sure whatever you suggest Because I will be doing this on Prod. server.
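For reference, the statement I have in mind, written out with placeholder names (server, database, and table names here are just examples), is:

select *
into dbo.TestTable -- created locally on the Test server by SELECT ... INTO
from ProdServer.ProdDatabase.dbo.customer; -- four-part linked server name; all names are placeholders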
March 28, 2015 at 10:00 am
Initial Suggestions:
1. Don't do it that way at all.
2. Certainly don't take the advice of ANYONE on the net and run it on your prod environment without testing it first, if you value your career.
From my own experience, just trying to pull 50M rows over a linked server connection is going to end BADLY. Hell, pulling 50M rows on the SAME server will tend to affect performance if you do it in one big batch.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
March 28, 2015 at 1:06 pm
Thank you so much, Matt!!! Are there any other options?
March 30, 2015 at 7:31 am
I would rather restore the database on the test server under a different name, then use the Import/Export Wizard with the query option to import the data into the table in small batches based on date.
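A rough sketch of what that could look like; the file paths, logical file names, and date column below are all placeholders (check the real logical names with RESTORE FILELISTONLY):

-- Restore the prod backup under a different name on the Test server.
RESTORE DATABASE ProdDB_Copy
FROM DISK = N'D:\Backups\ProdDB.bak'
WITH MOVE N'ProdDB_Data' TO N'D:\SQLData\ProdDB_Copy.mdf',
     MOVE N'ProdDB_Log'  TO N'D:\SQLData\ProdDB_Copy.ldf',
     RECOVERY;

-- Then, as the query option in the Import/Export Wizard, one batch at a time:
SELECT *
FROM ProdDB_Copy.dbo.customer
WHERE OrderDate >= '20150101'
  AND OrderDate <  '20150201';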
Hope this helps!
March 30, 2015 at 8:58 am
Tac11 (3/28/2015)
Thank you so much, Matt!!! Are there any other options?
Depending on what's feasible:
1. Consider just restoring the production DB that holds the table into your test environment, extracting the data as needed, then dropping the restored DB from test.
2. Create a staging DB in production, then run a series of batched extracts that "walk through" the customer table, pulling a few thousand rows at a time. That's usually easy to do if the table has some form of identity column, and doing it in batches tends not to bog down access to the prod table. Once you've built the copy, move that data over to test, either through backup/restore or detach/copy/attach (a backup/restore sketch follows the script below).
A notional sample of a script to "walk" the customer table might look like the following (you will need to adapt this to your situation, tables, etc.):
declare @startid int;
declare @batchsize int;

set @startid = 0;
set @batchsize = 50000; -- Tune this to your environment: you're balancing throughput
                        -- against blocking. If the rows are small you could ramp up to
                        -- 100,000; if the server is really jammed, drop it to 10,000 or lower.

-- First batch, run outside the loop so @@ROWCOUNT is primed for the loop test.
insert into MyTemporaryDB.dbo.MyTempCustomerTable (mycustomercolumns)
select mycustomercolumns
from customer
where id between @startid and @startid + @batchsize - 1;

while @@rowcount > 0
begin
    -- Slide the window forward one full batch.
    set @startid = @startid + @batchsize;

    -- Keep the insert as the last statement in the loop so the WHILE test
    -- sees its row count.
    insert into MyTemporaryDB.dbo.MyTempCustomerTable (mycustomercolumns)
    select mycustomercolumns
    from customer
    where id between @startid and @startid + @batchsize - 1;
end
-- Caveat: a gap in id wider than one batch inserts zero rows and ends the loop early.
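Once the staging DB is populated, moving it over via backup/restore might look something like this (paths are placeholders, and the logical file names assume the defaults for a freshly created DB; verify with RESTORE FILELISTONLY):

-- On the Prod server: back up just the staging DB, not the whole prod DB.
BACKUP DATABASE MyTemporaryDB
TO DISK = N'D:\Backups\MyTemporaryDB.bak'
WITH INIT;

-- On the Test server, after copying the .bak across:
RESTORE DATABASE MyTemporaryDB
FROM DISK = N'E:\Backups\MyTemporaryDB.bak'
WITH MOVE N'MyTemporaryDB'     TO N'E:\SQLData\MyTemporaryDB.mdf',
     MOVE N'MyTemporaryDB_log' TO N'E:\SQLData\MyTemporaryDB_log.ldf',
     RECOVERY;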
Again - you REALLY want to look this over, understand it and "own" it so that you can run this safely in your environment without tearing the place up. Testing it in a non-prod environment would be the only sane way I would go about doing that, but that's ultimately your call.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?