March 13, 2008 at 11:37 am
I have several flat (.csv) files that I need to load into a table for additional work. Each of these files can have as many as 1M records; okay, so far no big deal.
When I try to use the Bulk Load I keep getting, "Cannot bulk load. The file "C:\TZ.csv" does not exist."
Okay, I'll just use a data flow with a flat file source and a DB destination to copy the data. However, after 20K rows this stops. No errors, no warnings; it just stops, having successfully copied 20K rows.
Does anyone have hints on how to use Bulk Load, or why the data flow route only handles 20K rows at a time?
Tim
March 13, 2008 at 11:24 pm
The reason you're getting the "file does not exist" error is probably permissions. Whatever user your SQL Server service starts up as must be able to "see" the directory you're trying to load from. If the file is on a machine other than the server, you must use a UNC path: \\machinename\sharepath.
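As a minimal sketch, a BULK INSERT over a UNC path would look something like the following. The table name dbo.TZ and the share path are assumptions for illustration; the share must be readable by the SQL Server service account, and the delimiter settings should match your actual file layout.

```sql
-- Hypothetical destination table and share path; adjust to your environment.
-- The SQL Server service account (not your login) needs read access to the share.
BULK INSERT dbo.TZ
FROM '\\machinename\sharepath\TZ.csv'
WITH (
    FIELDTERMINATOR = ',',   -- comma-delimited .csv
    ROWTERMINATOR   = '\n',  -- one record per line
    FIRSTROW        = 1      -- bump to 2 if the file has a header row
);
```

Note that the FROM path is resolved on the server, not on your workstation, which is why a local path like C:\TZ.csv fails unless the file sits on the server's own C: drive.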
--Jeff Moden
Change is inevitable... Change for the better is not.
March 14, 2008 at 9:53 am
I am using the same source and destination files in both the Bulk Load and Data Flow. Is there a reason the data flow would stop after 20K rows?