September 24, 2014 at 11:44 am
Hi,
I have an existing SQL table with two columns, and those two columns form a composite primary key.
I need to insert almost 6,000 rows of data from an Excel, CSV, or text file into that table.
I don't want to drop the existing table, and since it has a primary key, no duplicate data can be inserted.
How can I insert the data from the text file into the existing SQL table without duplicates?
Immediate help would be appreciated.
Thanks
September 24, 2014 at 1:59 pm
Load the new data into a temp table, and then use MERGE.
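A minimal sketch of that approach might look like the following; the target table table1, the columns id and srno, the temp table #temp, and the file path are assumptions based on this thread, not confirmed details.
-- Stage the file in a temp table; the path and delimiters are hypothetical
CREATE TABLE #temp (id INT NOT NULL, srno INT NOT NULL);

BULK INSERT #temp
FROM 'C:\data\newrows.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Insert only the rows that do not already exist in the target table
MERGE table1 AS target
USING #temp AS source
    ON target.id = source.id
   AND target.srno = source.srno
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, srno)
    VALUES (source.id, source.srno);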
September 24, 2014 at 3:09 pm
Yes, I did, but it is still showing the primary key violation.
The table has only two columns, and both columns together make up the primary key.
I used this statement:
MERGE table1 AS target
USING #temp AS source
ON target.id = source.id
AND target.srno = source.srno
WHEN NOT MATCHED BY TARGET THEN
INSERT (id, srno)
VALUES (source.id, source.srno);
but while running this statement it still throws the error:
Violation of PRIMARY KEY constraint. The statement has been terminated.
September 24, 2014 at 3:13 pm
Hi,
Shall we do it with error handling? Can you give me a sample of how to do that in this situation?
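A minimal error-handling sketch, assuming the same table and column names as above, might look like this; note that TRY/CATCH only reports the violation, it does not remove the duplicate rows that cause it.
BEGIN TRY
    MERGE table1 AS target
    USING #temp AS source
        ON target.id = source.id
       AND target.srno = source.srno
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (id, srno)
        VALUES (source.id, source.srno);
END TRY
BEGIN CATCH
    -- Surface the error details instead of letting the batch stop with an unhandled error
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
END CATCH;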
September 24, 2014 at 3:29 pm
Do you have duplicates on the key in your text file/temp table?
September 24, 2014 at 3:33 pm
No idea; the text file has almost 5,000 rows.
September 24, 2014 at 3:39 pm
I'd assume that's where your problem is with your insert statement. I normally wouldn't bother with a merge for something like this. If it's just the two columns you have and nothing else, I'd do this:
INSERT INTO table1 (id, srno)
SELECT DISTINCT id, srno
FROM #temp t
WHERE NOT EXISTS (SELECT * FROM table1 l WHERE l.id = t.id AND l.srno = t.srno)
September 25, 2014 at 5:05 am
The text file has duplicate rows.
After adding a GROUP BY clause to remove them, the MERGE statement works perfectly fine.
Thank you
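For clarity, the de-duplication described above might look like the following, with the same assumed table and column names as earlier; grouping (or selecting DISTINCT) in the USING clause collapses the duplicate rows from the file before the MERGE runs.
MERGE table1 AS target
USING (SELECT id, srno FROM #temp GROUP BY id, srno) AS source
    ON target.id = source.id
   AND target.srno = source.srno
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, srno)
    VALUES (source.id, source.srno);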