Transform Task errors

  • I have a Data Transformation task that extracts and transforms data from a text file into a database table. The data in the text file is causing a primary key constraint violation on the table during insertion. I have specified that source and destination error rows should be written to separate files (using the properties in the Exception File section of the Transform Data task, with the 7.0 format option unchecked).

    However, the source and destination error rows are not being written to any file. The error log indicates that there has been a constraint violation but does not tell me which source row caused the error.

    Following is what I see in the error log:

    Execution Started: 4/11/2005 11:52:32 PM

    @@LogSourceRows: C:\ELogic\CDS Feed Processing\CDSDataFeed\Transform_E_Records_Exceptions.txt.Source

    @@LogDestRows: C:\ELogic\CDS Feed Processing\CDSDataFeed\Transform_E_Records_Exceptions.txt.Dest

    @@ErrorRow:  59553

    Error at Destination for Row number 59553. Errors encountered so far in this task: 1.

    Error Source: Microsoft OLE DB Provider for SQL Server

    Error Description:The statement has been terminated.

    Error Help File:

    Error Help Context ID:0

    Error Source: Microsoft OLE DB Provider for SQL Server

    Error Description:Violation of PRIMARY KEY constraint 'PK_cds_demographic_record'. Cannot insert duplicate key in object 'cds_demographic_record'.

    Error Help File:

    Error Help Context ID:0

    @@SourceRow:  Not Available

    @@DestRow:  Not Available

    @@ExecutionCompleted

    Execution Completed: 4/11/2005 11:53:09 PM

    Why are the files Transform_E_Records_Exceptions.txt.Source and Transform_E_Records_Exceptions.txt.Dest not getting created?

    I would appreciate any thoughts and suggestions from folks on this forum.

    Thanks

  • The exception file logging does not recognise duplicate key or referential integrity errors at the row level, so it does not create the .Source and .Dest exception files in this case.

    Here is the relevant excerpt from BOL (Books Online):

    Detecting Row-Level Errors

    Tasks that transform data use exception files to record information about failed rows. Exception files can contain:

    • Package information, such as package name, description, and version.

    • Step execution information, including the name of the package step associated with the data pump operation and step execution times.

    • Error information, including the source of the error (for example, the data pump or a connection) and a description of the error (for example, an insert error that occurred on EmployeeData column, row 2007).

    The tasks that transform data are able to detect row-level errors before the row is submitted to the database. For example, suppose an input row contains missing or incorrectly formatted data. When these tasks encounter such a row, they fail the row and do not pass it to the destination. This error counts as one failure toward the maximum error count.

    Some errors, such as duplicate keys or referential integrity violations, cannot be detected at row level by these tasks. Such rows fail only after being passed to the destination. The failure is noted in the exception file, but the actual rows that failed are not logged. Thus, complete error information is not always available in the exception logs.

    If you configure an exception log for a task that transforms data, step execution information is appended to the exception file you specify each time the package is run. If you specify an exception file that does not currently exist, the file will be created at package execution time. If the step associated with the task does not run, no exception file data is generated.
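
    As a workaround for identifying the offending rows yourself, one approach is to load the text file into a key-less staging table first and check for duplicates before inserting into the real table. The T-SQL below is only a sketch: the staging table and its columns (demographic_id, record_data) are assumed names, since the actual schema of cds_demographic_record was not posted.

    -- Sketch only: cds_demographic_staging, demographic_id and record_data are assumed names.
    -- Key-less staging table that the Transform Data task can load without hitting a primary key.
    CREATE TABLE cds_demographic_staging (
        demographic_id int          NOT NULL,
        record_data    varchar(255) NULL
    )

    -- After the load, list the source keys that appear more than once
    -- (these are the rows that would violate PK_cds_demographic_record).
    SELECT demographic_id, COUNT(*) AS occurrences
    FROM cds_demographic_staging
    GROUP BY demographic_id
    HAVING COUNT(*) > 1

    -- Move one row per key, and only keys that are not already in the destination table.
    INSERT INTO cds_demographic_record (demographic_id, record_data)
    SELECT s.demographic_id, MAX(s.record_data)
    FROM cds_demographic_staging AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM cds_demographic_record AS r
                      WHERE r.demographic_id = s.demographic_id)
    GROUP BY s.demographic_id

    This way the Transform Data task never touches the primary key directly, and the duplicate-key rows stay visible in the staging table instead of appearing only as a row number in the exception log.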

     

     

    Regards,

    Meghana
