December 15, 2003 at 4:28 pm
I have a bulk insert (via a stored procedure - not BCP) process that works perfectly for 25 rows taken from a live data file. When I attempt to run the same bulk insert against the live 50 meg data file I get the following errors. The input file is a CSV file with 21 columns. Again, the code works perfectly with a smaller input file.
I suspect a truncation of the input stream due to the file size. Is there an upper limit on the input file size to bulk insert? Is the size limit documented anywhere? Thanks in advance for any assistance.
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 21. Make sure the field terminator and row terminator are specified correctly.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned 0x80004005: The provider did not give any information about the error.].
The statement has been terminated.
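For reference, a BULK INSERT of a 21-column CSV generally takes a form like the sketch below; the table name, file path, and terminator values here are placeholders, not the poster's actual code.

-- Hedged sketch only: dbo.StagingTable and the file path are made-up placeholders.
BULK INSERT dbo.StagingTable
FROM 'C:\data\livefile.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column separator for the 21 CSV columns
    ROWTERMINATOR   = '\n'   -- must match the file's actual line endings
)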
December 15, 2003 at 4:38 pm
I would suspect dodgy or incorrectly set terminators. I've successfully used BULK INSERT to import files that are hundreds of MB in size.
See if you can import the file using the BULK INSERT task in DTS.
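As a rough illustration of the terminator point (a sketch with placeholder names, not Phill's exact code): when Msg 4866 complains about the last column, the row terminator in the WITH clause often doesn't match the file's real line endings, so it can be worth running the statement with each form explicitly.

-- Hypothetical example; table and path are placeholders.
-- Try Windows-style line endings first...
BULK INSERT dbo.StagingTable
FROM 'C:\data\livefile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n')

-- ...and if column 21 still overflows, retry with Unix-style endings:
-- WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')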
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
December 15, 2003 at 6:47 pm
Thanks Phill,
You have eliminated one of the variables for me, i.e. the size of the file. I'll focus my attention on the data tomorrow morning. Thanks for the fast assist, and have a great day!
December 16, 2003 at 8:25 am
One thing you could do is use DTS to ensure that your file is in the correct format. Sometimes, this will point you to the area of your file where there are problems.
Steve