June 21, 2012 at 12:08 am
Hi,
We have one package where we are inserting into an OLE DB destination.
We often get an error that a unique constraint has been violated.
However, if we restart the package, it completes successfully without any issue.
I am using SSIS 2005 and the error is as below:
"Cannot insert duplicate key row in object 'DMA_SUF.D_CUSTOMER_SOLD' with unique index 'IX_D_CUSTOMER_SOLD".
We would like to know why this ghost error happens 🙂
Thanks in advance
June 22, 2012 at 12:51 am
Do you select the exact same number of rows every time?
Do you have some sort of mechanism that detects whether rows are updates or inserts?
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 22, 2012 at 1:44 am
Hi Koen,
No, we do not insert the same number of rows every time.
It's insert-only, with no updates.
It takes its source from a table, then does some lookups, etc., and finally inserts into the final table.
The source table is a staging table into which new data is loaded from a file every day.
However, before it inserts, it checks whether the data already exists in the destination table; if it does not exist, it continues.
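In T-SQL terms, that check is roughly equivalent to this (just a sketch; the staging table and column names are made up for illustration, only DMA_SUF.D_CUSTOMER_SOLD comes from the actual error message):

INSERT INTO DMA_SUF.D_CUSTOMER_SOLD (CustomerKey, SoldDate, Amount)
SELECT s.CustomerKey, s.SoldDate, s.Amount
FROM   dbo.STG_CUSTOMER_SOLD AS s            -- hypothetical staging table
WHERE  NOT EXISTS (SELECT 1
                   FROM  DMA_SUF.D_CUSTOMER_SOLD AS d
                   WHERE d.CustomerKey = s.CustomerKey);  -- hypothetical key column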
The strange thing is that when you restart the package, everything runs fine.
Thanks in advance for your help.
Ally
June 22, 2012 at 1:48 am
My guess is that you have duplicate rows in your source.
The first row inserts fine, but the second row gives you the unique key violation error.
When you run the package again, both rows will be removed at the lookup component, as the key already exists in the destination.
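You can see the mechanism with a minimal sketch (made-up table and key, just to illustrate):

-- A destination with a unique index, initially empty.
CREATE TABLE dbo.DemoDest (CustomerKey INT NOT NULL);
CREATE UNIQUE INDEX IX_DemoDest ON dbo.DemoDest (CustomerKey);

-- The batch contains key 1 twice. Neither row exists in the
-- destination yet, so a lookup against it lets both rows through.
INSERT INTO dbo.DemoDest (CustomerKey) VALUES (1);  -- succeeds
INSERT INTO dbo.DemoDest (CustomerKey) VALUES (1);  -- duplicate key error

-- On a re-run, key 1 is already there, so the lookup filters
-- both source rows out and the package completes without error.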
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 22, 2012 at 5:39 am
Koen Verbeeck (6/22/2012)
My guess is that you have duplicate rows in your source. The first row inserts fine, but the second row gives you the unique key violation error.
When you run the package again, both rows will be removed at the lookup component, as the key already exists in the destination.
Excellent guess - yes, I've run into this one before. If you've used a cached lookup, it will not catch any dupes within the data to be inserted, because the cache reflects only what was in the destination when it was loaded.
The best solution is to ensure that dupes are not in your source data, rather than turning off caching in your lookups, as that really hurts performance.
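For example, you could deduplicate in the source query with ROW_NUMBER (a sketch; the key and ordering columns here are assumptions, not your real ones):

-- Keep one row per key value; which row wins is decided by the ORDER BY.
WITH ranked AS
(
    SELECT s.*,
           ROW_NUMBER() OVER (PARTITION BY s.CustomerKey   -- the unique key column(s)
                              ORDER BY s.LoadDate DESC)    -- keep the most recent row
           AS rn
    FROM dbo.STG_CUSTOMER_SOLD AS s                        -- hypothetical staging table
)
SELECT CustomerKey, LoadDate   -- plus whatever other columns you need
FROM   ranked
WHERE  rn = 1;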
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
June 22, 2012 at 6:03 am
Unfortunately guys, I have cross-checked the source table several times. No duplicates were found.
June 22, 2012 at 6:32 am
Ally Abbas (6/22/2012)
Unfortunately guys, I have cross-checked the source table several times. No duplicates were found.
Apologies if this is obvious, but you're not looking for a full duplicate, just a duplicate on the columns defined in the IX_D_CUSTOMER_SOLD index, whatever they are.
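If you're not sure which columns that index covers, the catalog views will tell you, and then you can group on exactly those columns (the second query uses placeholder names; substitute your real staging table and the columns the first query returns):

-- List the key columns of the unique index.
SELECT c.name
FROM   sys.indexes AS i
JOIN   sys.index_columns AS ic
  ON   ic.object_id = i.object_id
 AND   ic.index_id  = i.index_id
JOIN   sys.columns AS c
  ON   c.object_id = ic.object_id
 AND   c.column_id = ic.column_id
WHERE  i.name = 'IX_D_CUSTOMER_SOLD'
ORDER BY ic.key_ordinal;

-- Then check the source for duplicates on exactly those columns.
SELECT Col1, Col2, COUNT(*) AS cnt
FROM   dbo.YourStagingTable
GROUP BY Col1, Col2
HAVING COUNT(*) > 1;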
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
June 22, 2012 at 6:43 am
Could it be this reason: