SQL 2012 SSIS row-by-row processing

  • I am converting data from one database to another, and one of the main tables is a master name table. The issue I have run into is that my data flow task processes rows in batches. I know this is optimal for the rest of my conversion, but not for names: I am using a Lookup component to check whether each name already exists in the new database, and while that works for names arriving in subsequent batches, a name that appears multiple times within the same batch produces duplicates. So my thought is to process the records one by one to eliminate the dups.

    I have scoured the Web for a way to do this, and the closest thing I can find is to create a data flow task that pushes the data into a Recordset Destination (an object variable) and then loops over it. Is there a better way, or better yet, a way to set it up so that each "batch" is only one record at a time?

    Thanks in advance for any help 🙂

  • Change the properties of your Lookup component so that its lookups are not cached (set its cache mode to 'No cache').

    It will hit performance hard, but it should do what you want.

    The absence of evidence is not evidence of absence.
    Martin Rees

    You can lead a horse to water, but a pencil must be lead.
    Stan Laurel

  • Phil Parkin (11/20/2014)


    Change the properties of your Lookup component so that its lookups are not cached (set its cache mode to 'No cache').

    It will hit performance hard, but it should do what you want.

    And change the OLE DB Destination so that it does not use fast load as well. And still, if rows with the same name are in the same buffer, the Lookup will have checked all of them before any of them has been inserted, so duplicates can slip through...

    This really has to be tested well. And it will be the slowest package in SSIS history 🙂

    (I think using T-SQL will be much more efficient here: use DISTINCT and NOT EXISTS to get faster results. See the sketch below.)
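    As a minimal sketch of what I mean, assuming hypothetical table and column names (dbo.NameSource for the old data, dbo.NameMaster for the target, with a FullName column):

    ```sql
    -- Hypothetical tables: dbo.NameSource (old data) and dbo.NameMaster (target).
    -- DISTINCT collapses duplicates within the source itself;
    -- NOT EXISTS skips names that are already in the target.
    INSERT INTO dbo.NameMaster (FullName)
    SELECT DISTINCT s.FullName
    FROM dbo.NameSource AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.NameMaster AS m
        WHERE m.FullName = s.FullName
    );
    ```

    Because this is a single set-based statement, SQL Server resolves all the duplicates in one pass instead of doing one lookup per row.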

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Thanks for the responses! I tried setting the Lookup to No cache and made sure I was not using fast load; however, that did not change anything as far as making the package run row by row.

    My solution was to drop this whole approach and just write a stored procedure that loops through the data. It is slow, but since I am moving away from writing my conversion programs in Visual FoxPro, even if it takes 5 hours to process 7 million records, that is still faster than the 4 days the old FoxPro code would take.
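    In case it helps anyone, here is a stripped-down sketch of the kind of loop I mean, using a cursor; the table and column names (dbo.NameSource, dbo.NameMaster, FullName) are placeholders rather than my real schema:

    ```sql
    -- Placeholder schema: insert each distinct source name into the
    -- target only if it is not already there, one row at a time.
    DECLARE @Name varchar(200);

    DECLARE name_cur CURSOR FAST_FORWARD FOR
        SELECT DISTINCT FullName FROM dbo.NameSource;

    OPEN name_cur;
    FETCH NEXT FROM name_cur INTO @Name;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF NOT EXISTS (SELECT 1 FROM dbo.NameMaster WHERE FullName = @Name)
            INSERT INTO dbo.NameMaster (FullName) VALUES (@Name);

        FETCH NEXT FROM name_cur INTO @Name;
    END

    CLOSE name_cur;
    DEALLOCATE name_cur;
    ```

    Processing one row at a time is exactly what makes it slow, but it guarantees each name is checked against everything inserted before it.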

    Thanks again!

  • teareal_1 (11/25/2014)


    Thanks for the responses! I tried setting the Lookup to No cache and made sure I was not using fast load; however, that did not change anything as far as making the package run row by row.

    My solution was to drop this whole approach and just write a stored procedure that loops through the data. It is slow, but since I am moving away from writing my conversion programs in Visual FoxPro, even if it takes 5 hours to process 7 million records, that is still faster than the 4 days the old FoxPro code would take.

    Thanks again!

    Now that I think of it, it's possible you also had to set the batch size to 1 in the OLE DB Destination.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

