Performance - Handling large tables using Data flow task - SQL Server 2005

  • Hi,

    There is a large table (over 300 million rows) with lots of historical data that I need transferred from a production server to a secondary server. I'm using an SSIS package to transfer this data. Can a data flow task handle this kind of load in one go? Or would I be better off moving the table part by part? If that is better, what would be an efficient number of records per go?

    I'm concerned since the source is a production server and I haven't used SSIS to handle this kind of load. Any advice is appreciated.

    Thanks

    Vip

  • Why not use bulk insert?

    INSERT INTO server2.db2.dbo.table2 SELECT * FROM table1

    Also, put the secondary database into the bulk-logged recovery model (if it's currently in full recovery), which will keep the log file on the secondary from growing dramatically...
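
    For example, on SQL Server 2005 the switch might look like this (a sketch only; db2 here stands for whatever the destination database is actually called):

    -- Switch the destination database to bulk-logged before the load,
    -- then switch back and take a log backup to keep the log chain intact.
    ALTER DATABASE db2 SET RECOVERY BULK_LOGGED;
    -- ... run the transfer ...
    ALTER DATABASE db2 SET RECOVERY FULL;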



    Pradeep Singh

  • The data flow task in SSIS will issue an INSERT BULK statement. I would follow Pradeep's advice on the recovery model. Just optimize the options on the destination adapter: rows per batch, maximum insert commit size, and table lock for the fastest load.
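
    If you do end up moving the table part by part rather than in one go, a rough T-SQL sketch of a keyed batching loop is below. It is purely illustrative: it assumes the table and linked-server names from Pradeep's post, a single ever-increasing integer key column called id, and a batch size you would need to test (50,000-100,000 rows per batch is a common starting point).

    DECLARE @last_id INT, @next_id INT, @batch_size INT;
    SET @last_id = 0;
    SET @batch_size = 50000;

    WHILE 1 = 1
    BEGIN
        -- Find the key value that closes the next batch on the source.
        SELECT @next_id = MAX(id)
        FROM (SELECT TOP (@batch_size) id
              FROM dbo.table1
              WHERE id > @last_id
              ORDER BY id) AS batch;

        -- No rows left to copy.
        IF @next_id IS NULL BREAK;

        -- Push one slice across the linked server.
        INSERT INTO server2.db2.dbo.table2
        SELECT * FROM dbo.table1
        WHERE id > @last_id AND id <= @next_id;

        SET @last_id = @next_id;
    END

    Each INSERT runs as its own transaction, so you can take log backups on the destination between batches to keep its log in check.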
