Large data load

  • We have an ASP.NET app that generates 12 million rows of data that need to be inserted into a single table. The calculation the app performs takes about 20 minutes to run, but the insert takes the better part of 3 hours. I've tried several approaches, none of which has been suitable.

    First, I had the developer pass me the data in XML format, 200,000 rows at a time (larger batches failed on the web server where the app is running). This took roughly 3.5 hours to load.

    Next, I had the developer pass me the data row by row (I wasn't too concerned with contention here; there are only two users for this app). This took roughly 3 hours to load.

    I want to try using bcp now. Is there a way to load directly into the table via bcp from a file in byte format?
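
    Something along the lines of the command below is what I have in mind; the server, database, table, and file names are placeholders, -n tells bcp the file is in native (binary) format, and -b commits in batches:

    [code]
    bcp LoadDb.dbo.CalcResults in C:\loads\calc_results.dat -n -S MyServer -T -b 50000 -h "TABLOCK"
    [/code]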

    Does anyone have any other suggestions on how to handle this? I thought about horizontally partitioning the data and loading it via multiple threads into several different tables, but that approach causes performance issues on the web server.

    TIA,

    MKulangara


    Mathew J Kulangara
    sqladventures.blogspot.com

  • Large data loads normally go quicker in batches, especially if you can change the database recovery model to simple or bulk-logged. Make sure you're not getting file growths as part of your load; these typically slow the process down.
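
    As a rough sketch (the database and file names here are made up, so adjust to your own), the recovery model change and a pre-grow look like this:

    [code]
    -- Switch to bulk-logged for the duration of the load (minimal logging for bulk operations).
    ALTER DATABASE LoadDb SET RECOVERY BULK_LOGGED;

    -- Pre-grow the data and log files so autogrowth doesn't interrupt the load.
    ALTER DATABASE LoadDb MODIFY FILE (NAME = LoadDb_Data, SIZE = 8192MB);
    ALTER DATABASE LoadDb MODIFY FILE (NAME = LoadDb_Log,  SIZE = 2048MB);

    -- ...run the batched load here...

    -- Switch back afterwards and take a full (or log) backup to re-establish the log chain.
    ALTER DATABASE LoadDb SET RECOVERY FULL;
    [/code]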

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

  • You may use BULK INSERT. That decreases load time significantly.
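
    A minimal example, assuming a native-format file at a made-up path and a hypothetical target table:

    [code]
    BULK INSERT dbo.CalcResults
    FROM 'C:\loads\calc_results.dat'
    WITH (
        DATAFILETYPE = 'native',  -- or 'char' for a plain delimited text file
        BATCHSIZE    = 50000,     -- commit in chunks rather than one 12-million-row transaction
        TABLOCK                   -- table lock allows minimal logging under simple/bulk-logged
    );
    [/code]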

    Zubeyir
