SSIS package fails due to potential loss of data

  • Hi all

I have an SSIS job that imports data from a CSV file into an OLE DB destination. The fields in the flat file are separated by the delimiter ###

Example data:

    123###abcd###55###42199

    655###defg######45566

    456######44###34567

    567###null###45###null

### is the column delimiter. The job runs successfully when run locally on my desktop. However, the same package fails when run from the server.

The failure is: "The value could not be converted because of a potential loss of data."

All data types match between source and destination, and the package runs fine locally but not on the server.

Is this due to text qualifiers? Any ideas on how to overcome the issue?

Thanks

This usually happens not because of your delimiter, but because the source metadata assumes a length of x while your target is x-n, so the conversion throws up warnings.

The best way to check this out is to double-click the lines between the components and read the metadata on the information passed from the source, and see what sizes the fields are. If you've got a large field going into a smaller field, it can complain about this. Another common problem is BIGINT going into INT.

The easiest way to avoid this is to figure out which field(s) are the problem, then use a Derived Column to copy over the column and adjust the data type/length (e.g. (DT_STR,x,1252)SUBSTRING(field,1,x)) to fit the target.
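Before adjusting lengths in the package, it can help to measure the actual field widths in the flat file and compare them against the destination column sizes. A minimal sketch in Python, assuming a ###-delimited file with no text qualifiers and one record per line:

```python
def max_field_lengths(lines, delimiter="###"):
    """Return the longest value seen in each column position."""
    maxima = []
    for line in lines:
        fields = line.rstrip("\r\n").split(delimiter)
        # Grow the list if this row has more columns than seen so far
        while len(maxima) < len(fields):
            maxima.append(0)
        for i, value in enumerate(fields):
            maxima[i] = max(maxima[i], len(value))
    return maxima

# The sample rows from the original post
sample = [
    "123###abcd###55###42199",
    "655###defg######45566",
    "456######44###34567",
    "567###null###45###null",
]
print(max_field_lengths(sample))  # prints [3, 4, 2, 5]
```

If any reported width exceeds the corresponding destination column's size (check the data flow path metadata for the declared lengths), that column is the likely culprit.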


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables

    Twitter: @AnyWayDBA

