Looking for the best practice for doing an insert/update load from a flat file.

  • I'm loading a group of tables, each with up to 100 million rows, and I'm looking for the best layout for an insert/update package.

    Currently I'm using a Lookup against the destination table to bring back a value, followed by a Conditional Split. If the value is null, I pass the row to a bulk insert; if the value is not null, I pass the row to a SQL command that runs an update statement (sketched below).
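    To illustrate, the per-row statement on the update path looks something like this. The table and column names are placeholders, not from my actual tables; the ? markers are how an OLE DB Command maps input columns to parameters.

    ```sql
    -- Hedged sketch: an OLE DB Command executes a parameterized statement
    -- like this once per row on the update path. dbo.TargetTable,
    -- BusinessKey, Col1, and Col2 are hypothetical names.
    UPDATE dbo.TargetTable
    SET    Col1 = ?,             -- mapped from the incoming row
           Col2 = ?
    WHERE  BusinessKey = ?;      -- the key the Lookup matched on
    ```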

    The process works great and seems reasonably fast, or it did until I loaded the first 30 million rows into one of the tables. Now the lookup is unbearably slow. Is there a better way to do this, or do I need to go back to my old method of loading the whole file into a table and then using SQL to sort out the inserts/updates (sketched below)? I tried the Slowly Changing Dimension transformation, but it required the input to be sorted, and sorting the input file is even worse than doing the lookup.
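    For reference, the old method I mean is roughly the sketch below: bulk-load the whole file into a staging table, then use one set-based statement to split the inserts from the updates. The table and column names here are placeholders, and MERGE assumes SQL Server 2008 or later.

    ```sql
    -- Minimal sketch of the staging-table approach (hypothetical names).
    -- Step 1 (not shown): bulk insert the flat file into dbo.StagingTable.
    -- Step 2: a single MERGE handles both inserts and updates in one pass.
    MERGE dbo.TargetTable AS tgt
    USING dbo.StagingTable AS src
          ON tgt.BusinessKey = src.BusinessKey
    WHEN MATCHED THEN
        UPDATE SET tgt.Col1 = src.Col1,
                   tgt.Col2 = src.Col2
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (BusinessKey, Col1, Col2)
        VALUES (src.BusinessKey, src.Col1, src.Col2);
    ```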

    Kenneth

    Kenneth Fisher
    I was once offered a wizard's hat, but it got in the way of my dunce cap.
    For better, quicker answers on T-SQL questions: http://www.sqlservercentral.com/articles/Best+Practices/61537/
    For better answers on performance questions: http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
    Blog: www.SQLStudies.com

  • I have used the SCD transformation before, but since you're already fighting performance and latency problems, the SCD transformation would be even slower at this. You could try handling the logic in a script in a Script Task, if that helps. Otherwise, I would have done the same thing you did.

    Imagination is more important than knowledge. - Albert Einstein
