Data Flow Task needs to be refreshed?

  • Okay, so I have a package that connects to a DB2 database, pulls down copies of a bunch of its tables, and then does a bunch of massaging of the data on the SQL Server side.

    I have about 12 tables that get copied from DB2 and they all have the same basic Data Flow Task:

    1. Use an ADO.NET connection to the DB2 database as the source.

    2. Go through a Data Conversion step to reformat the string values.

    3. Send the results to an OLE DB destination pointing at the SQL Server.

    On a regular basis... every stinking day... when the package runs, it errors on one or two of those steps. There's no real error involved. All I have to do is open up the package, go into the Data Flow Task for the step that's erroring, review the column mappings, go back to the Control Flow, and run it again, and then it works just fine.

    I don't change anything. The DB2 source table for the Data Flow Task doesn't change. Nothing changes as far as I can tell... however, I still have to open up that Data Flow Task and refresh it before the package will run successfully.

    Can anyone tell me any possible solutions to this? Or, at least, what the problem might be?

    Thanks.

  • My guess is that the source schema is changing. You should log errors, but I think you probably have a rather poor DB2 driver (are there any good ones?) that infers data types from some portion of the data it reads. The data is a little different each run, so the driver reports something like a shorter character field than SSIS expects, and you get an error.

    If you log your errors, you should get indications of this from data type errors.

    In your source query, CAST or CONVERT (if possible) every field to ensure the driver always reports the correct data types (something like the sketch at the end of this reply).

    Also, turn off the ValidateExternalMetadata property on your data flow source components.

    If you log your errors and it is not some kind of a data type issue, post the errors.
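
    For illustration, here is a minimal sketch of that idea: wrap every column in an explicit CAST and print the column metadata the driver reports back, so you can check whether it drifts from run to run. pyodbc, the DSN string, and the CUSTOMER table/column names are placeholders, not anything taken from this thread.

    ```python
    # Sketch only: the DSN and the CUSTOMER table/columns are placeholders;
    # substitute your real DB2 connection string and source query.
    import pyodbc

    # Wrap every column in an explicit CAST so the driver has no room to
    # "infer" a different type from one run to the next.
    SOURCE_QUERY = """
    SELECT
        CAST(CUST_ID   AS INTEGER)     AS CUST_ID,
        CAST(CUST_NAME AS VARCHAR(50)) AS CUST_NAME,
        CAST(CREATED   AS DATE)        AS CREATED
    FROM CUSTOMER
    """

    conn = pyodbc.connect("DSN=MyDb2Dsn;UID=user;PWD=secret")  # placeholder DSN
    cursor = conn.cursor()
    cursor.execute(SOURCE_QUERY)

    # cursor.description is what the driver claims for each column:
    # (name, type_code, display_size, internal_size, precision, scale, nullable).
    # Capture this right before the nightly run and compare it with a daytime
    # run; if the reported types or sizes drift, that is what SSIS trips on.
    for name, type_code, _, size, precision, scale, nullable in cursor.description:
        print(f"{name}: type={type_code}, size={size}, "
              f"precision={precision}, scale={scale}, nullable={nullable}")

    conn.close()
    ```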

  • I tried adding a CAST to all of the columns returned in the source (which is an ADO.NET source pointing to the DB2 database) and still got the error when it ran as scheduled. Again, I can run it from my desk in debug mode, but as soon as it runs on its nightly schedule, it errors. Even versions that have just been sitting on my hard drive overnight won't run right away the next morning; I need to open them up and refresh that step.

    I've talked to everyone who might possibly be doing something to the source table in question, but they all say that nothing is changing with that table.

  • Oh, and there is a Data Conversion step between the source pull and the destination push. All it really does is turn the DB2 string output into a string [DT_STR] that the SQL Server destination table can understand (and similarly with the dates).

  • Save a copy of your SSIS package while it is broken, and again once you have "refreshed" your connection, then compare the two packages (they are just XML files).

    It may help you determine exactly what is getting changed in the package; a scripted diff like the sketch below works, too.
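
    If you would rather script the comparison than eyeball it, here is a minimal sketch; the file names are made up, and any text diff tool (Beyond Compare, a plain editor, whatever) does the same job. By default only sensitive values such as passwords are encrypted in a .dtsx file, so unless the package's ProtectionLevel is set to one of the EncryptAll options, the XML is readable.

    ```python
    # Sketch only: the file names are made up; save the broken copy and the
    # "refreshed" copy of the package under whatever names you like.
    import difflib

    with open("broken.dtsx", encoding="utf-8") as f:
        broken = f.readlines()
    with open("refreshed.dtsx", encoding="utf-8") as f:
        refreshed = f.readlines()

    # Print a unified diff of the two package definitions so you can see
    # exactly which metadata changed between the failing and working copies.
    for line in difflib.unified_diff(broken, refreshed,
                                     fromfile="broken.dtsx",
                                     tofile="refreshed.dtsx"):
        print(line, end="")
    ```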

  • Ooo... nice trick... I like the sound of that.

    What do you use to compare them? Just a text editor? Probably even something like Beyond Compare would work... unless there's some encryption in front of the XML, right?
