Bulk Inserts

  • I have a project where I need to bulk insert a log text file into my table.

    That works great. However... out of every 1,000 rows there are only 50 rows I actually want. Is there a way to filter these out during my insert?

    I can do it afterwards, but I am processing over 10 million records, and that takes a while...

    Thanks

    [font="Comic Sans MS"]Being "normal" is not necessarily a virtue; it rather denotes a lack of courage.
    -Practical Magic[/font]

  • purplebirky (12/27/2010)


    is there a way to filter these out during my insert?

    Most likely not. Bulk Insert doesn't have any way to filter by criteria. Of course, that's one of the many reasons why I load data into a staging table first, then validate the data, and then move only the validated data to its final resting spot (see the sketch after this post).

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
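
    To make the staging-table approach concrete, here is a minimal T-SQL sketch. The table names (dbo.LogStaging, dbo.LogFinal), the file path, the comma delimiter, and the Severity filter are all assumptions for illustration; adjust them to your file layout and to whatever criteria identify the rows you actually want.

    -- A minimal sketch, assuming hypothetical names: dbo.LogStaging, dbo.LogFinal,
    -- the file C:\data\log.txt, comma-delimited fields, and a Severity column to filter on.

    -- 1) Load the whole text file into a staging table.
    CREATE TABLE dbo.LogStaging
    (
        LogDate  varchar(30)   NULL,
        Severity varchar(10)   NULL,
        Message  varchar(4000) NULL
    );

    BULK INSERT dbo.LogStaging
    FROM 'C:\data\log.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        TABLOCK          -- helps enable a minimally logged load when conditions allow
    );

    -- 2) Validate, then move only the rows you actually want to the final table.
    INSERT INTO dbo.LogFinal (LogDate, Severity, Message)
    SELECT LogDate, Severity, Message
    FROM   dbo.LogStaging
    WHERE  Severity = 'ERROR';   -- whatever criteria identify the ~50 rows per 1,000

    -- 3) Clear the staging table for the next file.
    TRUNCATE TABLE dbo.LogStaging;

    Keeping the raw load and the filtering as two separate steps also means a bad row never touches the final table, and the filtered INSERT...SELECT runs as a single set-based statement rather than row by row.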
