BCP? Is there a better way?

  • Hello All,

    I did not design this database, but I am having to deal with it for a little while. I do have a say in the design of the new one.

    Perhaps the title is a little misleading. I need to load a history table from a staging table that is populated by a SELECT INTO query pulling from many different tables. That query returns approximately 2 billion rows and takes a few hours to load.

    What I am looking for is a way to load my history table with the data that is in the staging table. However, since there are so many rows, I want to load a small number of rows at a time, say 100,000 rows, commit, then load another 100,000 and commit again.

    The only way I know is to pump the data out to a text file and then use BCP to pump it back in, since BCP can commit after a set number of rows. Does anyone know a way to perform this task with the data still in the staging table, without pumping it out to text files?
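
    Just to illustrate what I am after, here is a rough sketch of a keyed batch loop. The table and column names (dbo.Staging, dbo.History, StagingID, Col1, Col2) are made up, and TOP (@variable) assumes SQL Server 2005:

        DECLARE @BatchSize INT
        DECLARE @LastID    BIGINT
        DECLARE @NextID    BIGINT

        SET @BatchSize = 100000
        SET @LastID    = 0

        WHILE 1 = 1
        BEGIN
            -- Find the top key of the next batch (NULL once staging is exhausted)
            SELECT @NextID = MAX(StagingID)
            FROM (SELECT TOP (@BatchSize) StagingID
                  FROM dbo.Staging
                  WHERE StagingID > @LastID
                  ORDER BY StagingID) AS NextBatch

            IF @NextID IS NULL BREAK

            -- Each slice commits on its own, so the log is freed as the load goes
            BEGIN TRANSACTION

            INSERT INTO dbo.History (StagingID, Col1, Col2)
            SELECT StagingID, Col1, Col2
            FROM dbo.Staging
            WHERE StagingID > @LastID AND StagingID <= @NextID

            COMMIT TRANSACTION

            SET @LastID = @NextID
        END

    That way each 100,000-row slice is its own transaction and the log never has to hold all 2 billion rows at once.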

    Thanks

    Andrew SQLDBA

  • Andrew,

    I think you're after the BULK INSERT T-SQL command. Check out 'BULK INSERT' in Books Online under 'Transact-SQL Reference'. You can then use the ROWS_PER_BATCH or BATCHSIZE arguments.

    NOTE: Be sure to pre-size your target database's log file so it does not have to autogrow repeatedly during the load; repeated autogrow dramatically slows bulk loading.
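
    For example (the file path is hypothetical, and note that BULK INSERT reads from a data file, so the staging data would still have to be exported first):

        BULK INSERT dbo.History
        FROM 'D:\export\staging.dat'   -- hypothetical export of the staging table
        WITH (
            BATCHSIZE = 100000,        -- commit every 100,000 rows
            TABLOCK                    -- table-level lock during the load
        )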

    ll

    Look at:

    1. DTS, with the "fast load" and "rows per batch" options.

    2. If you're a programmer (even just a little bit), look at the .NET 2.0 SqlBulkCopy class (in System.Data.SqlClient).
