Loading LARGE Flat Files

  • I have several flat (.csv) files that I need to load into a table for additional work. Each of these files can have as many as 1M records; okay, so far no big deal.

    When I try to use the Bulk Load I keep getting, "Cannot bulk load. The file "C:\TZ.csv" does not exist."

    Okay, I'll just use a data flow with a flat file source and a DB destination to copy the data. However, after 20K rows it stops. No errors, no warnings; it just stops, having successfully copied 20K rows.

    Does anyone have hints on how to use Bulk Load, or on why the data-flow route only handles 20K rows at a time?

    Tim

  • The reason you're getting the "file does not exist" error is probably permissions. Whatever user your SQL Server service starts up as must be able to "see" the directory you're trying to load from. If the file is on a machine other than the server, you must use a UNC path: \\machinename\sharepath.
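    For example, with the file shared out from the workstation, the load might look something like this (the share name, table name, and file layout are made up for illustration; the SQL Server service account would need read access to the share):

    ```sql
    -- BULK INSERT runs in the server's security context, so the path is
    -- resolved on the server itself, not on the workstation running SSMS.
    -- 'C:\TZ.csv' therefore means the SERVER's C: drive.
    BULK INSERT dbo.TZ
    FROM '\\workstation01\loads\TZ.csv'  -- UNC path reachable by the service account
    WITH (
        FIRSTROW = 2,          -- skip a header row, if the file has one
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    );
    ```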

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I am using the same source and destination files in both the Bulk Load and Data Flow. Is there a reason the data flow would stop after 20K rows?
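  • In the meantime, a load like this can also be scripted outside SSIS. A minimal sketch of batched inserts, using Python's csv module with an in-memory sqlite3 table standing in for the real SQL Server destination (the sample data, table name tz, and batch size are all illustrative assumptions):

    ```python
    import csv
    import io
    import sqlite3

    # Hypothetical sample standing in for one of the large .csv files.
    sample = io.StringIO("id,name\n1,alpha\n2,beta\n3,gamma\n")

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tz (id INTEGER, name TEXT)")

    reader = csv.reader(sample)
    next(reader)  # skip the header row

    batch, BATCH_SIZE = [], 50_000  # commit in batches to bound memory and log growth
    for row in reader:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            conn.executemany("INSERT INTO tz VALUES (?, ?)", batch)
            conn.commit()
            batch.clear()
    if batch:  # flush the final partial batch
        conn.executemany("INSERT INTO tz VALUES (?, ?)", batch)
        conn.commit()

    print(conn.execute("SELECT COUNT(*) FROM tz").fetchone()[0])  # → 3
    ```

    Committing in fixed-size batches rather than row by row (or in one giant transaction) keeps memory flat and makes it easy to see exactly where a load stalls.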

