Asynchronous Data Flow - How to add conditional wait

  • I have a data flow that checks imported data against the DB and inserts new records.

    I decided to additionally insert records into another table.

    So now I do the lookup as before, but I split the No Match output (with a Multicast), since the OLE DB destination does not have an output I can use to pass the data on.

    This is fine, except the inserts into the new table require knowledge of the ID keys generated in the original table.

    Doing a 2nd lookup should work, provided that the original table inserts happen prior to my lookup.

    So how do I add a conditional so that the 2nd Lookup doesn't precede the Insert happening on the other side of the Multicast split?

    Is it only possible by splitting this up into separate data flows and managing the order in the Control Flow?

    That was my original plan, but it seemed like a hassle to keep track of which rows were added. If I'm forced to go this route, can I store a temporary data set in a variable, or should I just dump it to a temp table?
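    If the temp-table route wins out, the "keep track of which rows were added" part can be handled by joining the staged rows back to the destination table on the business key after the first insert finishes. A minimal sketch, assuming hypothetical table and column names (`#StagedProviders`, `SourceKey`, etc. are illustrative, not from the original post):

    ```sql
    -- Assumes the data flow has already landed the no-match rows in
    -- #StagedProviders and inserted them into dbo.Provider (IDENTITY key).
    -- The "second lookup" is just a join back on the business key.
    INSERT INTO dbo.ProviderDetail (ProviderID, DetailValue)
    SELECT p.ProviderID, s.DetailValue
    FROM #StagedProviders AS s
    JOIN dbo.Provider AS p
        ON p.SourceKey = s.SourceKey;   -- business key, not the IDENTITY
    ```

    Run as an Execute SQL Task after the data flow, this guarantees the ordering that a single data flow cannot.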

  • While I think it would be better if the start of a lookup within a data flow could depend on an insert finishing without being directly downstream in the flow path, I don't think it is possible.

    Therefore I think I'll use the Cache Transform to save off the data and use a separate data flow to do the insert.

    The problem now is that if I use a Lookup with a Cache Connection Manager, it requires an input, right? But really I need the cached data to be my source. How do I do that? It seems hokey to save the cache off to a .caw file just to read from it in a separate data flow task.

  • Could you post an image of your data flow?

  • Adding attachment

  • If I understand your posts and image correctly, you need to ensure that the "StageProviders" task completes before the "Insert Providers" task starts...but you still need the non-matching output from the "Lookup Provider" task for the last insert. Am I interpreting that correctly?

  • Martin Schoombee (8/18/2015)


    If I understand your posts and image correctly, you need to ensure that the "StageProviders" task completes before the "Insert Providers" task starts...but you still need the non-matching output from the "Lookup Provider" task for the last insert. Am I interpreting that correctly?

    These are not tasks. They're transforms. They operate simultaneously, and asynchronously, on batches of rows. Each one will only be complete when it has processed all the rows of data.

    There's no easy way of accomplishing the desired results in a data flow. I would be tempted to import the data into temp tables and then do the rest via stored procedures.



    Alvin Ramard
    Memphis PASS Chapter

    All my SSC forum answers come with a money back guarantee. If you didn't like the answer then I'll gladly refund what you paid for it.

    For best practices on asking questions, please read the following article: Forum Etiquette: How to post data/code on a forum to get the best help

  • Alvin Ramard (8/18/2015)


    Martin Schoombee (8/18/2015)


    If I understand your posts and image correctly, you need to ensure that the "StageProviders" task completes before the "Insert Providers" task starts...but you still need the non-matching output from the "Lookup Provider" task for the last insert. Am I interpreting that correctly?

    These are not tasks. They're transforms. They operate simultaneously, and asynchronously, on batches of rows. Each one will only be complete when it has processed all the rows of data.

    There's no easy way of accomplishing the desired results in a data flow. I would be tempted to import the data into temp tables and then do the rest via stored procedures.

    You're right...typo on my part...and agreed that it would be better to do this through some kind of staging process, outside of the data flow.
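  • The "temp tables plus stored procedures" approach suggested above might look like the following: a data flow loads a staging table, then an Execute SQL Task calls a procedure that performs both inserts in order, using the OUTPUT clause to capture the generated keys. This is only a sketch; all object names (`StageProviders`, `Provider`, `SourceKey`, etc.) are hypothetical stand-ins, not taken from the original package.

    ```sql
    -- Hypothetical procedure implementing the stage-then-insert pattern.
    CREATE PROCEDURE dbo.usp_InsertProviders
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Captures the IDENTITY values generated by the first insert.
        DECLARE @NewKeys TABLE (ProviderID int, SourceKey varchar(50));

        -- Insert only staged rows that don't already exist
        -- (the equivalent of the Lookup's no-match output).
        INSERT INTO dbo.Provider (SourceKey, ProviderName)
        OUTPUT INSERTED.ProviderID, INSERTED.SourceKey INTO @NewKeys
        SELECT s.SourceKey, s.ProviderName
        FROM dbo.StageProviders AS s
        WHERE NOT EXISTS (SELECT 1
                          FROM dbo.Provider AS p
                          WHERE p.SourceKey = s.SourceKey);

        -- This insert runs only after the first has finished --
        -- the ordering guarantee a single data flow couldn't provide.
        INSERT INTO dbo.ProviderDetail (ProviderID, DetailValue)
        SELECT k.ProviderID, s.DetailValue
        FROM @NewKeys AS k
        JOIN dbo.StageProviders AS s
            ON s.SourceKey = k.SourceKey;
    END;
    ```

    Because the procedure is one sequential batch, no conditional wait inside the data flow is needed at all.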

