Flat file extraction error

I am using SSIS 2008 and trying to extract data from a tab-delimited file.

In the file (see the screenshot above), I am getting an error at the row where Counter = 1494059.

I am using a Derived Column transformation with the expression '(DT_DBTIMESTAMP)Date'.

Because of that, I am getting the following errors:

    [Derived Column 1 [295]] Error: An error occurred while attempting to perform a type cast.

    [Derived Column 1 [295]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "Derived Column 1" (295)" failed because error code 0xC0049064 occurred, and the error row disposition on "input column "Date" (297)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

I tried to redirect erroneous rows, but that is not working either.

Can anyone please help sort this one out?

P.S. All the previous files were processed successfully; we are getting errors with this one file only, even though it has identical structure and encoding.
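One way to locate the offending row outside SSIS is a quick scan of the file. This is only a sketch: the file path, the date column position, and the MM/DD/YYYY format are assumptions, since the post does not show them.

```python
import csv
from datetime import datetime

def find_bad_dates(path, date_col=0, fmt="%m/%d/%Y"):
    """Scan a tab-delimited file and return (line number, value) for every
    row whose date column fails to parse -- the rows the SSIS cast would
    choke on. Column index and date format are assumptions."""
    bad = []
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for lineno, row in enumerate(csv.reader(f, delimiter="\t"), start=1):
            try:
                datetime.strptime(row[date_col].strip(), fmt)
            except (ValueError, IndexError):
                bad.append((lineno, row[date_col] if date_col < len(row) else "<missing>"))
    return bad
```

Pointing this at the problem file should surface whichever rows refuse to cast, without waiting for the package to fail.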

  • Rather than looking at the source data in Excel, can you please open it in Notepad++ or similar and then have a close look at the offending data?

    From your screenshot, everything looks fine. But Excel has a tendency to mask problems.
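To see what a spreadsheet might be hiding, a few lines of Python can dump the raw bytes of a suspect line; the file path and line number below are placeholders.

```python
def show_hidden(path, target_line):
    """Print one line of a file as raw bytes and as a hex dump, so
    characters a spreadsheet hides (BOMs, non-breaking spaces, stray
    CR characters, embedded tabs) become visible."""
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            if lineno == target_line:
                print(raw)                                  # bytes literal, e.g. b'10/22/2013\xc2\xa0\t...'
                print(" ".join(f"{b:02x}" for b in raw))    # hex view of the same bytes
                return raw
    return None
```

A non-breaking space (`c2 a0` in UTF-8) sitting inside a date field would look identical to a normal space in Excel but still break the cast.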

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

    Now, instead of the date and time conversion, I tried the following:

    SUBSTRING(LTRIM(RTRIM([Date])), LEN(LTRIM(RTRIM([Date]))) - 3, LEN(LTRIM(RTRIM([Date]))))
    + "-" + SUBSTRING(LTRIM(RTRIM([Date])), 1, FINDSTRING(LTRIM(RTRIM([Date])), "/", 1) - 1)
    + "-" + SUBSTRING(LTRIM(RTRIM([Date])), FINDSTRING(LTRIM(RTRIM([Date])), "/", 1) + 1, 2)
    + " 00:00:00.000"

    STILL NOT WORKING 🙁

    I get the error: 'An error occurred while evaluating the function.'
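For what it's worth, the expression above fails on exactly the kind of value that broke the original cast: FINDSTRING returns 0 when there is no "/", so the SUBSTRING length becomes -1 and the whole expression errors out. A rough Python equivalent (assuming MM/DD/YYYY input, which is a guess) reproduces that behavior:

```python
def rearrange_date(s):
    """Mimic the SSIS expression: turn 'MM/DD/YYYY' into
    'YYYY-MM-DD 00:00:00.000'. Raises ValueError (the SSIS analogue of
    'An error occurred while evaluating the function') when the value
    has no '/' or is too short."""
    s = s.strip()
    slash = s.find("/") + 1          # like FINDSTRING: 1-based, 0 = not found
    if slash == 0 or len(s) < 4:
        raise ValueError(f"cannot parse date field: {s!r}")
    year = s[len(s) - 4:]            # SUBSTRING(..., LEN - 3, ...): last four chars
    month = s[:slash - 1]            # everything before the first '/'
    day = s[slash:slash + 2]         # two chars after the first '/'
    return f"{year}-{month}-{day} 00:00:00.000"
```

Running the problem column through something like this flags the exact strings the expression cannot evaluate, which is more informative than the generic SSIS error.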

  • How are you so sure it fails on that specific line?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • I checked it in the Data Viewer

  • rockstar283 (10/22/2013)


    I checked it in the Data Viewer

    I thought you said redirecting error rows didn't work?


Yes. No rows are getting redirected to the error output (the red pipeline), and the data viewer on the green pipeline is telling me where the error is.

  • rockstar283 (10/22/2013)


    and also the data viewer on Green pipeline is telling me where the error is

    And why is that, if I may ask?


The data viewer on the green pipeline is showing me all the records before that particular erroneous record.

  • rockstar283 (10/22/2013)


    Data viewer on the Green pipeline is showing me all the records before that particular erroneous record

    Are you sure? It might just be showing all the records in the buffer that was being loaded when the error occurred.

    SSIS doesn't process data row by row; it moves batches of rows through the pipeline in buffers.
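That buffering behavior is easy to picture with a toy pipeline that hands rows downstream in fixed-size batches: a viewer attached between stages sees whole buffers, so the last rows it displays are the last complete buffer, not the last row before the failure. (The buffer size here is arbitrary; real SSIS buffers hold thousands of rows.)

```python
def buffered_pipeline(rows, buffer_size=3):
    """Yield rows in fixed-size batches, the way a data-flow engine moves
    data between components; a downstream viewer sees whole buffers,
    never individual rows."""
    buf = []
    for row in rows:
        buf.append(row)
        if len(buf) == buffer_size:
            yield buf
            buf = []
    if buf:                 # flush the final, possibly short, buffer
        yield buf
```

So the row shown last in a data viewer is only the end of a buffer, which may be well before, or after, the row that actually failed.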


OK, I tried almost everything, but the Derived Column transformation still seems to be the problem, so I have resorted to this:

I am creating a temp table, dumping the flat file into it as-is, and then, using CONVERT in an Execute SQL task, I move the records over to the final destination table.

I agree that it is one extra step and will take some extra time, but it gets the job done for now.
