Bulk Insert Failed

  • Hi

    I am creating an SSIS package where I need to load multiple files.  I prefer to do the SSIS work in C# so I have better control over certain parts of the package.  I came across the BULK INSERT function.  I would like to create a stored procedure for the bulk insert and then run it from C#.  I tested the statement in SSMS to make sure I got the parameters correct, but the bulk insert keeps failing.  If I use the Import Data wizard I can import the data with no problem.  This suggests that the file is OK and that my BULK INSERT is probably missing some detail I didn't provide.  What is wrong with the bulk insert?  Thank you

    use myDB
    go

    bulk insert dbo.TableTemp
    from 'C:\Users\Me\Temp_Files\File.csv'

    error:

    Msg 4866, Level 16, State 1, Line 4
    The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
    Msg 7399, Level 16, State 1, Line 4
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 4
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

    Completion time: 2022-02-01T14:39:41.0001529-08:00

     

     

  • Hello water490.

    I do not see a row terminator. Did you use one?

    If the CSV extract comes from a Unix-based OS, use '0x0a' as the row terminator; if not, you can use '\r\n'.

    From MS Docs:

    https://docs.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15

    BULK INSERT Sales.Invoices
    FROM '\\share\invoices\inv-2016-07-25.csv'
    WITH (FORMAT = 'CSV'
    , FIRSTROW=2
    , FIELDQUOTE = '"'
    , FIELDTERMINATOR = ';'
    , ROWTERMINATOR = '0x0a');
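
    To confirm which row terminator a file actually uses, you can peek at its raw bytes. A minimal sketch in Python (the filename is hypothetical):

    ```python
    # Peek at the raw bytes of a file to see whether rows end in
    # CRLF (b"\r\n", typical of Windows) or bare LF (b"\n", typical of Unix).
    def detect_row_terminator(path):
        with open(path, "rb") as f:
            chunk = f.read(64 * 1024)  # the first 64 KB is enough for a sample
        if b"\r\n" in chunk:
            return "\\r\\n"   # use ROWTERMINATOR = '\r\n' (or '0x0d0a')
        if b"\n" in chunk:
            return "0x0a"     # Unix endings: use ROWTERMINATOR = '0x0a'
        return "unknown"

    if __name__ == "__main__":
        # Hypothetical example file written with Windows-style endings
        with open("sample.csv", "wb") as f:
            f.write(b"col1,col2\r\n1,2\r\n")
        print(detect_row_terminator("sample.csv"))  # -> \r\n
    ```

    This avoids guessing which terminator to pass to BULK INSERT.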

  • Hi

    Thanks for the reply.  It still doesn't work:

    USE MyDB
    GO

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FORMAT = 'CSV', FIRSTROW = 2, FIELDQUOTE = '\', FIELDTERMINATOR = ';', ROWTERMINATOR = '\r\n');

     

    Msg 4866, Level 16, State 9, Line 4
    The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
    Msg 7301, Level 16, State 2, Line 4
    Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".

    Completion time: 2022-02-01T19:18:51.8841276-08:00
  • Hi

    I fixed one error.

    WITH (FORMAT = 'CSV', FIRSTROW = 2, FIELDQUOTE = '\', FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n');

    Now, I am getting this error:

    Msg 7301, Level 16, State 2, Line 4
    Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".

    Completion time: 2022-02-01T20:10:17.4722524-08:00

    Any suggestions?

  • can you give us a sample file?

     

  • frederico_fonseca wrote:

    can you give us a sample file?

    No problem.

    • This reply was modified 2 years, 9 months ago by  water490. Reason: removed link
  • script to create table:

    /****** Object:  Table [dbo].[TableTemp]    Script Date: 2022-02-02 8:09:25 AM ******/
    SET ANSI_NULLS ON
    GO

    SET QUOTED_IDENTIFIER ON
    GO

    CREATE TABLE [dbo].[TableTemp](
    [UNDERLYING_SYMBOL] [nvarchar](10) NOT NULL,
    [QUOTE_DATE] [date] NOT NULL,
    [ROOT_SYMBOL] [nvarchar](10) NOT NULL,
    [EXPIRATION] [date] NOT NULL,
    [smallint] NOT NULL,
    [OPTION_TYPE] [nchar](1) NOT NULL,
    [OPEN_PRICE] [numeric](10, 5) NOT NULL,
    [HIGH_PRICE] [numeric](10, 5) NOT NULL,
    [LOW_PRICE] [numeric](10, 5) NOT NULL,
    [CLOSE_PRICE] [numeric](10, 5) NOT NULL,
    [TRADE_VOLUME] [numeric](10, 5) NOT NULL,
    [BID_SIZE_1545] [numeric](10, 5) NOT NULL,
    [BID_1545] [numeric](10, 5) NOT NULL,
    [ASK_SIZE_1545] [numeric](10, 5) NOT NULL,
    [ASK_1545] [numeric](10, 5) NOT NULL,
    [UNDERLYING_BID_1545] [numeric](10, 5) NOT NULL,
    [UNDERLYING_ASK_1545] [numeric](10, 5) NOT NULL,
    [IMPLIED_UNDERLYING_PRICE_1545] [numeric](10, 5) NOT NULL,
    [ACTIVE_UNDERLYING_PRICE_1545] [numeric](10, 5) NOT NULL,
    [IMPLIED_VOLATILITY_1545] [numeric](10, 5) NOT NULL,
    [DELTA_1545] [numeric](10, 5) NOT NULL,
    [GAMMA_1545] [numeric](10, 5) NOT NULL,
    [THETA_1545] [numeric](10, 5) NOT NULL,
    [VEGA_1545] [numeric](10, 5) NOT NULL,
    [RHO_1545] [numeric](10, 5) NOT NULL,
    [BID_SIZE_EOD] [numeric](10, 5) NOT NULL,
    [BID_EOD] [numeric](10, 5) NOT NULL,
    [ASK_SIZE_EOD] [numeric](10, 5) NOT NULL,
    [ASK_EOD] [numeric](10, 5) NOT NULL,
    [UNDERLYING_BID_EOD] [numeric](10, 5) NOT NULL,
    [UNDERLYING_ASK_EOD] [numeric](10, 5) NOT NULL,
    [VWAP] [numeric](10, 5) NOT NULL,
    [OPEN_INTEREST] [numeric](18, 5) NOT NULL,
    [DELIVERY_CODE] [nvarchar](10) NOT NULL,
    [TIME_STAMP] [datetime2](7) NOT NULL
    ) ON [PRIMARY]
    GO

    ALTER TABLE [dbo].[TableTemp] ADD CONSTRAINT [DF_TableTemp_TIME_STAMP] DEFAULT (getdate()) FOR [TIME_STAMP]
    GO


    There is one extra column in the table that is not found in the CSV, but I was able to import the file using the Import Data wizard.  I also tried removing the column and then using BULK INSERT; no luck, so I added it back.

    The bulk insert options you used were just a sample; you needed to adjust them to your own needs.

    You also can't have extra columns on the destination table. If you need the timestamp to be added, you will need to load into a temp table (e.g. #temptable) containing only the columns that are part of the file, and then insert from that table into your other table with the extra columns.

    Also, the column you defined as smallint is in reality a decimal, so you need to change its definition to decimal/numeric(10, 5), for example.

    In your case one of the following should work.

    Your sample file has fields delimited by ',' (comma), so you should have set FIELDTERMINATOR = ','.

    The row terminator is a CRLF, so either '\n' or '0x0D0A' would work:

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0D0A');

    In SQL 2017 the following is likely to work as well (I can't test it on my current PC):

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FORMAT = 'CSV', FIRSTROW = 2, ROWTERMINATOR = '\n')

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FORMAT = 'CSV', FIRSTROW = 2, ROWTERMINATOR = '0x0D0A');
  • frederico_fonseca wrote:

    The bulk insert options you used were just a sample; you needed to adjust them to your own needs.

    You also can't have extra columns on the destination table. If you need the timestamp to be added, you will need to load into a temp table (e.g. #temptable) containing only the columns that are part of the file, and then insert from that table into your other table with the extra columns.

    Also, the column you defined as smallint is in reality a decimal, so you need to change its definition to decimal/numeric(10, 5), for example.

    In your case one of the following should work.

    Your sample file has fields delimited by ',' (comma), so you should have set FIELDTERMINATOR = ','.

    The row terminator is a CRLF, so either '\n' or '0x0D0A' would work:

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0D0A');

    In SQL 2017 the following is likely to work as well (I can't test it on my current PC):

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FORMAT = 'CSV', FIRSTROW = 2, ROWTERMINATOR = '\n')

    BULK INSERT dbo.TableTemp
    FROM 'C:\Users\Me\File.csv'
    WITH (FORMAT = 'CSV', FIRSTROW = 2, ROWTERMINATOR = '0x0D0A');

    Thank you so much!!

    It is really interesting how different the two approaches are (Import Data vs. BULK INSERT).  That column you mentioned has never been a problem when I use Import Data, but it is a problem for BULK INSERT.  I learn something new every day!  🙂

  • water490 wrote:

    Thank you so much!!

    It is really interesting how different the two approaches are (Import Data vs. BULK INSERT).  That column you mentioned has never been a problem when I use Import Data, but it is a problem for BULK INSERT.  I learn something new every day!  🙂

    That is because Import Data uses SQL objects to load the data: it maps each column (input and output) available on both sides, and the remaining ones are ignored.

    You could also do it in C#, which, if this is part of the SFTP script, would likely be easy enough to implement; in that case you could keep the extra columns.

  • frederico_fonseca wrote:

    you also can't have extra columns on the destination table

    With the understanding that I've not read all the posts in this thread, you actually can.  Then you can do one of two things: either create a BCP format file to "skip" columns ( https://www.sqlservercentral.com/articles/skipping-columns-other-tricks-using-bcp-format-files-sql-spackle-1 ) or create a view that maps the columns you're importing and do the BULK INSERT into that view instead of the table.
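
    As a rough illustration of the format-file route, here is a sketch that generates a non-XML BCP format file mapping each CSV field to its table column ordinal; a table column that is not in the file (such as TIME_STAMP) is simply never referenced, so BULK INSERT leaves it to its DEFAULT.  This is a sketch in Python; the column names and ordinals come from the thread's dbo.TableTemp, and you should check the generated file against the BCP format-file documentation before relying on it:

    ```python
    # Sketch: build a non-XML BCP format file as a string.
    # Each data-file field is one line:
    #   field#  host-type  prefix-len  max-len  terminator  table-col#  name  collation
    def make_format_file(fields, field_terminator=",", row_terminator="\\r\\n",
                         version="13.0"):
        """fields: list of (csv_field_name, table_column_ordinal) in file order."""
        lines = [version, str(len(fields))]
        for i, (name, ordinal) in enumerate(fields, start=1):
            # the last field in a row is terminated by the row terminator
            term = row_terminator if i == len(fields) else field_terminator
            lines.append(f'{i}\tSQLCHAR\t0\t100\t"{term}"\t{ordinal}\t{name}\t""')
        return "\n".join(lines) + "\n"

    # First three file fields of the thread's table, as an example;
    # TIME_STAMP (the last table column) never appears here.
    fmt = make_format_file([("UNDERLYING_SYMBOL", 1),
                            ("QUOTE_DATE", 2),
                            ("ROOT_SYMBOL", 3)])
    print(fmt)
    ```

    You would save the output to a file and point BULK INSERT at it with the FORMATFILE option.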

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

    I used PowerShell to create a script that prepares the files for bulk insert.  It ensures that the columns match the destination table.  Works beautifully.
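
    The same idea can be sketched briefly (shown here in Python rather than PowerShell; the column list and file paths are hypothetical): rewrite each CSV so its columns match the destination table's import columns, in order, dropping anything extra.

    ```python
    import csv

    # Hypothetical destination column list, in table order (truncated here).
    DEST_COLUMNS = ["UNDERLYING_SYMBOL", "QUOTE_DATE", "ROOT_SYMBOL"]

    def prepare_csv(src_path, dst_path, dest_columns=DEST_COLUMNS):
        """Rewrite src_path so dst_path has exactly dest_columns, in order.
        Extra source columns are dropped; missing ones become empty fields."""
        with open(src_path, newline="") as src, \
             open(dst_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=dest_columns,
                                    extrasaction="ignore")
            writer.writeheader()
            for row in reader:
                writer.writerow(row)
    ```

    After this, a plain BULK INSERT with FIRSTROW = 2 lines up one-to-one with the table.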

  • water490 wrote:

    I used PowerShell to create a script that prepares the files for bulk insert.  It ensures that the columns match the destination table.  Works beautifully.

    You mean a PowerShell script that writes to another file?  If so, that seems like a waste.  I can see having a PoSh script that creates the proper BCP format file, which requires no duplication of data on disk.

    --Jeff Moden
