Run an update using OLE DB Destination

  • Here's the scenario: within a data flow task, I have a Flat File Source that reads a text file containing some supplemental data (an additional field value) for one of my existing tables. If possible, I want to run an update on the existing table without using a staging table. I can use a SQL command within the OLE DB Destination, but I can't find a way to refer to the incoming data in my T-SQL. Is there a way to reference the virtual table coming down the wire from my Flat File Source so I can run this update inline?

    Thx,

    Tim

    Tim Mitchell, Microsoft Data Platform MVP
    Data Warehouse and ETL Consultant
    TimMitchell.net | @Tim_Mitchell | Tyleris.com
    ETL Best Practices

  • Yes - it can be done, but don't.

    Use an OLE DB Command transformation instead. It is much easier to use and is meant for exactly what you are doing.

    You do not have to use the output from the component. If it makes you feel better, you can even go into the advanced editor and delete the output buffer.

  • Thanks Michael - I'm trying to avoid using that method; I'd rather just process the data inline. You say there is a way to do this - can you indicate how (even though it is ill-advised 😉)?

    Tim

  • The OLE DB Command will be inline.

    In your data flow, you are currently dragging in the OLE DB Destination from the Data Flow Destinations section of the toolbox.

    Instead of that component, scroll up and take the OLE DB Command from the Data Flow Transformations section.

    Attach your data flow to this component and you can use a parameterized query:

    UPDATE MyTable SET MyField = ? WHERE MyOtherField = ?

    Then you can wire your parameters to your data flow.

    The transformation component has an output, but you do not need to connect it to anything.
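
    As a concrete (hypothetical) illustration: the OLE DB Command exposes each ? placeholder as Param_0, Param_1, and so on, in the order they appear in the statement, and you map your incoming columns to those parameters on the Column Mappings tab. With made-up column names from the flat file, the mapping would look like this:

    UPDATE MyTable
    SET MyField = ?          -- Param_0: map to the incoming SupplementalValue column
    WHERE MyOtherField = ?   -- Param_1: map to the incoming BusinessKey column

    Keep in mind that the OLE DB Command fires this statement once per row, so on very large inputs it can be noticeably slower than a staging-table approach.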

  • Here are some pictures: [attached screenshots of an example data flow using the OLE DB Command; images not preserved]

  • Thanks... for some reason I went brain-dead for a minute and chased this all the way back to the Execute SQL Task in the Control Flow. The OLE DB Command component should work quite nicely.

    By the way, what is that Flow Sync component in your example? I've not seen that before. I do have the Trash Destination adapter plug-in, which is very handy.

  • The FlowSync came from here:

    http://www.sqlbi.eu/Default.aspx?tabid=74

    It is there to make sure the two data flows feed into the TableDifference component at the same speed (for memory reasons). The TableDifference component (which is really what the link above is about) compares two data flows, as long as they have the same field names and data types, and splits the rows into separate outputs for new, deleted, and changed rows.

    You can do this with the SCD wizard or with a Merge Join and a Conditional Split, but the SCD wizard is pretty slow on large data sets, and the Merge Join / Conditional Split approach is pretty ugly to set up.
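
    For comparison, here is a minimal set-based T-SQL sketch of the classification TableDifference performs, assuming hypothetical Source (incoming) and Target (existing) tables keyed on a BusinessKey column:

    -- Classify rows as New, Deleted, Changed, or Unchanged by joining
    -- the incoming data against the existing table on the business key.
    SELECT
        COALESCE(s.BusinessKey, t.BusinessKey) AS BusinessKey,
        CASE
            WHEN t.BusinessKey IS NULL THEN 'New'       -- only in Source
            WHEN s.BusinessKey IS NULL THEN 'Deleted'   -- only in Target
            WHEN s.SomeValue <> t.SomeValue THEN 'Changed'
            ELSE 'Unchanged'
        END AS RowStatus
    FROM Source AS s
    FULL OUTER JOIN Target AS t
        ON t.BusinessKey = s.BusinessKey;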
