Data Management

  • Let's just say I have 40M records to insert/update into the database...

    Currently I have a system that takes a full week to complete the process. Can anyone relate?

    Any advice on best practices for optimizing this? An algorithm or anything aside from BCP or BULK INSERT, since I also need to monitor all the records...

  • BCP is usually recommended for something like this. However, it depends on a lot of details, such as the file format, your window for the import, etc.
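
    For example, a minimal invocation might look like this (server, database, file names, and sizes below are just placeholders):

        bcp MyDb.dbo.TargetTable in C:\data\records.dat -S MyServer -T -c -b 50000 -e C:\data\rejects.log -m 100

    The -b option commits every 50,000 rows so a failure doesn't roll back the whole load, and -e/-m write rejected rows to an error file (up to 100 of them here) instead of aborting on the first bad row.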

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]

  • Hmm, I haven't actually dug deeper into BCP. Can we monitor the records being inserted into the table? Like errors in individual fields of a row, FK constraint violations, PK constraint violations, NULL values in NOT NULL fields?

    I know BCP is the way to go, but I'm not sure whether it can do all that monitoring the way BULK INSERT can.
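
    For reference, one common pattern for that kind of row-level monitoring (independent of bcp vs. BULK INSERT; all table and column names below are made up) is to bulk load into a constraint-free staging table and then run the checks yourself in T-SQL:

        -- staging table with loose types and no constraints, so every row loads
        CREATE TABLE dbo.Staging_Records (
            Id         INT         NULL,
            CustomerId INT         NULL,
            Amount     VARCHAR(50) NULL
        );

        -- (bcp or BULK INSERT the file into dbo.Staging_Records here)

        -- rows that would violate NOT NULL on the target
        SELECT * FROM dbo.Staging_Records WHERE Id IS NULL;

        -- rows that would violate the PK (duplicate keys)
        SELECT Id, COUNT(*) AS Dupes
        FROM dbo.Staging_Records
        GROUP BY Id
        HAVING COUNT(*) > 1;

        -- rows that would violate the FK to dbo.Customers
        SELECT s.*
        FROM dbo.Staging_Records AS s
        LEFT JOIN dbo.Customers AS c ON c.CustomerId = s.CustomerId
        WHERE c.CustomerId IS NULL;

    For plain format/conversion failures, both tools can also divert bad rows to a file instead of failing the whole load: bcp with -e/-m, BULK INSERT with the ERRORFILE and MAXERRORS options.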
