January 7, 2009 at 1:55 pm
I have a simple package which reads data from a text file and loads it into a database table,
e.g. a customer text file into a customer table (custID, custName, address).
The problem is that the customer table has a primary key (custID). How can I check the text file data against the data already in the customer table?
If any custID in the file is a duplicate, it should be removed; otherwise the row should be loaded into the table.
Sometimes we get text files in which some records are duplicated,
and other times we get two files with the same data, so the package fails (due to the primary key on the table).
Please suggest the best possible approaches.
Best Regards
sita
January 7, 2009 at 2:24 pm
SSIS doesn't have very good built-in tools for this.
It does have the Lookup transformation, but that doesn't help with duplicates within the file itself, and the Lookup transform can be a memory hog because (in full cache mode) it loads the whole reference table into memory.
When the data within the file is messy, it works well to import it into a staging table first.
Create a table called Customer_Staging with no constraints or primary keys.
Load all the data from the file into that table.
Then use an Execute SQL Task to insert into Customers only the rows that don't already exist, applying whatever rules you need to make sure the data is clean and unique.
If needed, add another SQL task to update the records that already exist.
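Here's a rough T-SQL sketch of what those SQL tasks could look like, assuming the table and column names from the posts above (Customer, Customer_Staging, custID, custName, address); adjust to your actual schema and cleansing rules:

-- Insert rows that don't already exist in Customer,
-- keeping only one row per custID from the staging table.
INSERT INTO dbo.Customer (custID, custName, address)
SELECT s.custID, s.custName, s.address
FROM (
    SELECT custID, custName, address,
           ROW_NUMBER() OVER (PARTITION BY custID ORDER BY custID) AS rn
    FROM dbo.Customer_Staging
) AS s
WHERE s.rn = 1
  AND NOT EXISTS (SELECT 1 FROM dbo.Customer AS c WHERE c.custID = s.custID);

-- Optionally, update customers that already exist with the staged values.
UPDATE c
SET c.custName = s.custName,
    c.address  = s.address
FROM dbo.Customer AS c
JOIN dbo.Customer_Staging AS s ON s.custID = c.custID;

-- Clear the staging table before the next load.
TRUNCATE TABLE dbo.Customer_Staging;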