December 7, 2009 at 7:26 pm
I have millions of records to archive from Server A to Server B, based on a date range. If using DTS, I know that in the DTS Data Transform task I can set a batch size of 10,000 or 20,000. How does DTS do that?
I will probably do it 1 month at a time (about 400,000 per month). If I archive outside of DTS, perhaps using T-SQL in a stored proc or job, is there a way to batch records, or get the same benefit?
December 7, 2009 at 8:20 pm
Yes. You can set it on the SQL Server destination in its advanced properties, or right on the OLE DB destination's editor page.
CEWII
December 8, 2009 at 4:14 am
Sorry, I don't follow. Do you mean the advanced properties of a job?
December 8, 2009 at 6:49 am
No. Within a data flow you can use either an OLE DB destination or a SQL Server destination. To get to the property we're looking for on the SQL Server destination, you have to open the Advanced Editor on that destination within the data flow.
Not sure if that is any clearer; if not, I'll build an example.
CEWII
December 8, 2009 at 8:07 am
I'm sorry, I might not have been clear. I'm familiar with how to set batch size in DTS. I was wondering if there's a way to control batch size when using other methods of transferring data, the same way DTS does. And what are good alternatives?
December 8, 2009 at 12:27 pm
Yes, in SSIS it is just as I described.
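If you end up doing it in pure T-SQL instead, the usual pattern is to move the rows in a loop, a batch at a time, so each transaction (and the log) stays small. A rough sketch, assuming a linked server called SERVERB; the table and column names here are made up, so adjust to your schema:

DECLARE @Batch TABLE (OrderID int PRIMARY KEY);

WHILE 1 = 1
BEGIN
    DELETE FROM @Batch;

    -- Grab the keys for the next batch of the month being archived
    INSERT INTO @Batch (OrderID)
    SELECT TOP (10000) OrderID
    FROM dbo.Orders
    WHERE OrderDate >= '20090101' AND OrderDate < '20090201'
    ORDER BY OrderID;

    IF @@ROWCOUNT = 0 BREAK;   -- nothing left in the range

    BEGIN TRAN;

    -- Copy the batch to the archive table on Server B
    INSERT INTO SERVERB.ArchiveDB.dbo.Orders (OrderID, OrderDate, CustomerID)
    SELECT o.OrderID, o.OrderDate, o.CustomerID
    FROM dbo.Orders AS o
    INNER JOIN @Batch AS b ON b.OrderID = o.OrderID;

    -- Delete the same rows locally
    DELETE o
    FROM dbo.Orders AS o
    INNER JOIN @Batch AS b ON b.OrderID = o.OrderID;

    COMMIT TRAN;
END

Each pass through the loop commits on its own, which is essentially the same benefit the DTS/SSIS batch size setting gives you.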
CEWII
December 8, 2009 at 12:42 pm
Hey Homebrew, you can do it using BCP; please find more info in the link below.
Managing Batches for Bulk Import
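For example, bcp's -b switch commits the import in batches. A sketch; the server, database, table, and file names here are only placeholders:

REM Export the month's rows from Server A into a native-format file
bcp "SELECT OrderID, OrderDate, CustomerID FROM MyDB.dbo.Orders WHERE OrderDate >= '20090101' AND OrderDate < '20090201'" queryout orders.dat -S ServerA -T -n

REM Load the file into Server B, committing every 10,000 rows
bcp ArchiveDB.dbo.Orders in orders.dat -S ServerB -T -n -b 10000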
Blog -- LearnSQLWithBru
Facebook -- Facebook.com/LearnSQLWithBru
Twitter -- BruMedishetty
December 8, 2009 at 8:23 pm
Do BCP and BULK INSERT work from SQL to SQL? All the examples show a .csv or some other non-SQL file involved. I tried to write some code to bulk insert from one SQL database to another, but it didn't seem to work.
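From what I can tell, bcp and BULK INSERT only read and write data files; they don't connect database-to-database. So SQL to SQL would mean exporting to a file first (e.g. bcp ... queryout) and then loading that file, something like this, with the path and table names made up:

-- Load a native-format file produced by bcp out, committing every 10,000 rows
BULK INSERT ArchiveDB.dbo.Orders
FROM '\\ServerA\Export\orders.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 10000);

Otherwise a plain INSERT ... SELECT over a linked server skips the file, but then it isn't a bulk load.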