Importing Failure: What does this DTS error really mean?

  • I am trying to import a pipe-delimited file into an existing table. To make sure I do not have any truncation issues, I have made all of the columns of the destination table nvarchar(1000) (i.e. overridden any suggested column types, because that suggestion feature does not work well for me). I *know* none of the columns are longer than 1000 characters. Yet I still get these errors when importing:

    * Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "Title" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

    (SQL Server Import and Export Wizard)

    * Error 0xc020902a: Data Flow Task: The "output column "Title" (26)" failed because truncation occurred, and the truncation row disposition on "output column "Title" (26)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

    (SQL Server Import and Export Wizard)

    Any help would be appreciated.

    Thanks,

    Patrick


    Kindest Regards,

    Patrick Allmond
    batteryfuel.com
    For all of your laptop, camera, power tool and portable battery needs.

  • Hi Patrick,

    When you create the connection for your import file, it defines a column type for each column of data from the selected input source. Check the source file connection properties (assuming you are using the Flat File connection manager): on the Advanced tab, for each column being read in, check the DataType and OutputColumnWidth. This is where your truncation is coming from: the flat file source defaults string columns to a width of 50 characters regardless of the destination column size.


    Kindest Regards,

    Catherine Eibner
    cybner.com.au

  • Thanks. I am now trying this with BULK INSERT and getting some different errors.

    This is a pipe-delimited file with 14 columns and an LF at the end of each line. There is a header row, so I need to start at row 2. So my BULK INSERT statement says:

    BULK INSERT dwm.bags
    FROM 'C:\dwmdata\bags.pipe'
    WITH (BATCHSIZE = 1000,
    FIELDTERMINATOR = '|',
    FIRSTROW = 2,
    ROWTERMINATOR = '''+CHAR(10)+'''
    );

    The import is dying with:

    Msg 4866, Level 16, State 8, Server devserver, Line 4
    The bulk load failed. The column is too long in the data file for row 1, column 14. Verify that the field terminator and row terminator are specified correctly.

    I am using Notepad++ to check the EOL character, and it is clearly a single LF.

    What am I doing wrong?
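    A likely culprit in the statement above: in a static BULK INSERT, the value '''+CHAR(10)+''' is not evaluated. SQL Server treats it as the literal string '+CHAR(10)+', so no line ever matches the row terminator. The CHAR(10) concatenation only expands inside dynamic SQL. A sketch of that version, assuming the same table and file path as in the post above:

```sql
-- Sketch: build the BULK INSERT as a string so that CHAR(10)
-- expands to a real line-feed character, then execute it.
-- Table name and file path are taken from the post above.
DECLARE @sql nvarchar(max);
SET @sql = N'BULK INSERT dwm.bags
FROM ''C:\dwmdata\bags.pipe''
WITH (BATCHSIZE = 1000,
      FIELDTERMINATOR = ''|'',
      FIRSTROW = 2,
      ROWTERMINATOR = ''' + CHAR(10) + N''');';
EXEC (@sql);
```

    Recent versions of SQL Server also accept a hex terminator such as ROWTERMINATOR = '0x0a' directly in a static statement; check the documentation for your version.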


    Kindest Regards,

    Patrick Allmond
    batteryfuel.com
    For all of your laptop, camera, power tool and portable battery needs.

  • Hi Patrick,

    Check whether the data value in any of the fields in your file contains a pipe symbol. We had a similar issue with a pipe-delimited file where some of the data values within a field contained the pipe symbol.
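    One way to test for this (a sketch; the staging table name is hypothetical, and the file path is taken from the earlier post): bulk load each raw line into a single-column staging table, then count the delimiters per line. Every row of a 14-column file should contain exactly 13 pipes.

```sql
-- Sketch: load whole lines into a one-column staging table by
-- using a field terminator that never appears in the data,
-- then flag rows whose pipe count is not 13 (14 columns).
CREATE TABLE dbo.bags_raw (line nvarchar(4000));

BULK INSERT dbo.bags_raw
FROM 'C:\dwmdata\bags.pipe'
WITH (FIELDTERMINATOR = '\0',  -- null character: no field split
      FIRSTROW = 2);

SELECT line,
       LEN(line) - LEN(REPLACE(line, '|', '')) AS pipe_count
FROM dbo.bags_raw
WHERE LEN(line) - LEN(REPLACE(line, '|', '')) <> 13;
```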
