Getting error while importing data from Excel into SQL Server

  • Hi everyone,

    I am trying to import data from an .xlsx Excel file, but I am getting this error:

    There was an error with output column "comments" (121) on output "OLE DB Source Output" (37). The column status returned was:

    "Text was truncated or one or more characters had no match in the target code page.".

    The Excel file was exported from Oracle. The data type of the "comments" column is VARCHAR2(2000 BYTE) in Oracle and varchar(2000) in SQL Server.

    When I searched for a solution, someone suggested converting the Excel file to a flat file. I did that and created the package again, but now I get this error:

    [Flat File Source [1571]] Error: Data conversion failed. The data conversion for column "COMMENTS" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

    [Flat File Source [1571]] Error: The "output column "COMMENTS" (1672)" failed because truncation occurred, and the truncation row disposition on "output column "COMMENTS" (1672)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

    [Flat File Source [1571]] Error: An error occurred while processing file "C:\Documents and Settings\Desktop\Copy of ABCD.txt" on data row 201.

    Can anyone please tell me what I need to do about the above errors?

    Should I continue using Excel or the flat file as the source? Both errors I am now getting are from the source.

  • To troubleshoot your problem, try changing the comments column in the SQL table to varchar(max) and test whether the data imports correctly.

  • Also check whether the comments column contains single quotes (').

  • Try using NVARCHAR(MAX) as the data type for the comments column.

    Prashant Bhatt
    Sr Engineer - Application Programming

  • I have tried using the nvarchar data type, but it did not help.

    And yes, there are single quotes and commas in the "comments" column. What should I do now?

  • If single quotes are present, try replacing them with some other character that does not appear anywhere in the comments column (you could choose double quotes). That should work.

  • If you have single quotes and commas in your columns, then you will need to work out a different delimiter so the computer knows when to break for the next column. Where I work, our data processing guy likes to use the pipe character | as the delimiter because there is little chance it will show up in client data and throw off the columns. It makes for a few other inconveniences here and there in our processes, but they are minor compared to the gains.

    If you can't change the delimiter, you might want to work out importing the data into a staging table with a single varchar(max) column holding each full row, parse it out and handle any errors, then import it into the target table.

    There are plenty of good articles on how to parse a data field out on the site.

    -- Kit

  • Thank you, I will try that.

  • How can I do the replacement when I am using Excel as the source? For now, I have set the source to ignore failures, which skipped 2 rows.

    Can you tell me how I should do the replacement when Excel is the source?

  • You can make the changes in Excel itself before importing, by selecting the comments column and using Find/Replace.
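    If redoing that find/replace by hand each time becomes tedious, the same cleanup can be scripted against a text export of the sheet before the import. A small sketch in Python (the file names and sample line are placeholders, not from the thread):

```python
from pathlib import Path
import tempfile

# Illustrative stand-in for the exported flat file; a real export would
# come from Excel's "Save As" text output.
work = Path(tempfile.mkdtemp())
src = work / "comments_export.txt"
dst = work / "comments_clean.txt"
src.write_text("the client said it's 'urgent'\n", encoding="utf-8")

# Replace single quotes with double quotes, mirroring the manual
# Excel find/replace suggested above.
dst.write_text(src.read_text(encoding="utf-8").replace("'", '"'),
               encoding="utf-8")
print(dst.read_text(encoding="utf-8"))
```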

  • srilu_bannu (7/19/2010)


    How can I do the replacement when I am using Excel as the source? For now, I have set the source to ignore failures, which skipped 2 rows.

    Can you tell me how I should do the replacement when Excel is the source?

    What's the failure that ignored two rows? Did everything import except two rows? Or does the import fail totally?

    -- Kit

  • This is the error I got when Excel is the source:

    There was an error with output column "comments" (121) on output "OLE DB Source Output" (37). The column status returned was:

    "Text was truncated or one or more characters had no match in the target code page.".

    After I got this error, I selected Ignore Failure in the source task and ran the package:

    the Excel file has 6,667,608 rows and the package imported 6,667,606 rows into the database; 2 rows were ignored.

  • If I'm understanding your question correctly, then yes, if you know the character that is causing the error, you can fix the error in Excel by replacing the incorrect character with the right one (or deleting it in the case of an extra quote or comma).

    This requires manual action. You have to open up the file, find it, replace it and then save the file again. This opens up the door to introducing other errors and human modification of the data before it hits your database. If this is an acceptable risk, then go ahead.

    Otherwise, you will have to build an SSIS package and some parsing code and some error handling code to do it in the database. Have the SSIS package import the file into a staging table that is one column of varchar(max). Then, parse out each row and put that into a different staging table. Code the error handling (some errors you might need to handle before going to the next staging table, some will need to be handled once the data is parsed.)

    From there, transfer to the original target table.

    This has the advantage of leaving your raw data intact. Also, if your raw data format changes (another column, etc.), modifications to the import code should be easy. If something suddenly becomes an error, it can be handled in the code, rather than having to be remembered each time you open up an Excel file to correct it before manually importing it.

    It does have the disadvantage of having to be built. It will require testing and coding and more testing before it is up and running. But after it is up and running, you won't have to do the import manually anymore.
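    As a rough sketch of that flow (in Python rather than SSIS, with a made-up three-column, pipe-delimited layout), the one-column staging area, the parse step, and the error handling fit together like this:

```python
# Sketch of the staging approach: land each raw line in a one-column
# "staging" area, parse it, and divert bad rows to an error list
# instead of failing the whole load. Layout and limits are hypothetical.
EXPECTED_COLUMNS = 3
MAX_COMMENT_LEN = 2000  # matches the varchar(2000) target column

staging = [
    "101|2010-07-19|short comment",
    "102|2010-07-19|comment with 'quotes', commas",
    "103|2010-07-19",  # malformed: missing the COMMENTS column
]

parsed, errors = [], []
for line in staging:
    fields = line.split("|")
    if len(fields) != EXPECTED_COLUMNS or len(fields[2]) > MAX_COMMENT_LEN:
        errors.append(line)      # handle or report later
    else:
        parsed.append(fields)    # ready for the target table

print(len(parsed), len(errors))  # 2 1
```

    The raw lines stay untouched in the staging list, so a bad row can be inspected and reprocessed without re-running the whole import.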

    -- Kit
