July 20, 2011 at 12:19 pm
I agree that sample data would be useful.
That would help determine whether the current design needs to be tweaked.
For better, quicker answers on T-SQL questions, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
July 20, 2011 at 12:51 pm
It turned out that there were duplicates. The problem was that I was joining two tables, and the join created duplicate rows. Creating a composite key solved the problem.
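(For reference, a minimal sketch of what such a composite key might look like; the table and column names here are assumptions for illustration, not the poster's actual schema.)
-- Hypothetical example: a composite primary key makes the combination of the
-- two columns unique, so duplicate pairs are rejected on INSERT.
-- dbo.code_category, codeschemeid, and categoryid are placeholder names.
ALTER TABLE dbo.code_category
ADD CONSTRAINT PK_code_category
    PRIMARY KEY CLUSTERED (codeschemeid, categoryid);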
July 20, 2011 at 12:55 pm
Ah! Glad you got it figured out.
July 20, 2011 at 1:03 pm
airparkroad (7/20/2011)
It turned out that there were duplicates. The problem was that I was joining two tables, and the join created duplicate rows. Creating a composite key solved the problem.
It is good that you figured out how to load the data without a duplicate key violation, but after quickly examining your schema, IMHO it does not appear to be in 3rd normal form. Without a definition of the requirements and a good sample pool, though, it is hard to say.
You may win the battle but lose the war. 🙂
You want to get the design correct from the start, or it becomes a real headache. ;-)
For better, quicker answers on T-SQL questions, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
July 21, 2011 at 2:45 am
Welsh Corgi (7/20/2011)
airparkroad (7/20/2011)
It turned out that there were duplicates. The problem was that I was joining two tables, and the join created duplicate rows. Creating a composite key solved the problem.
It is good that you figured out how to load the data without a duplicate key violation, but after quickly examining your schema, IMHO it does not appear to be in 3rd normal form. Without a definition of the requirements and a good sample pool, though, it is hard to say.
You may win the battle but lose the war. 🙂
You want to get the design correct from the start, or it becomes a real headache. ;-)
+1000
Johan
Learn to play, play to learn !
Don't drive faster than your guardian angel can fly ...
but keeping both feet on the ground won't get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀
Need a bit of Powershell? How about this
Who am I ? Sometimes this is me but most of the time this is me
July 22, 2011 at 11:18 am
Just to add my 2 cents: when I get something like this, I know there's either bad data in the database, or data that I don't know about or haven't thought about.
If I know the target table doesn't already contain anything that duplicates the source, then there must be duplicates in the source itself. The quickest way to find out is to take the SELECT part of the INSERT query, stick a wrapper around it, and have a look. Something on the order of:
-- Count how many source rows map to each key value;
-- any key returning more than one row will violate the key on INSERT.
SELECT X.category_db_id, COUNT(*) AS dupe_count
FROM
(
    -- the SELECT portion of the original INSERT
    SELECT b.category_db_id
    FROM code_raw a
    JOIN category b ON a.categoryid = b.categoryid
    JOIN schemes c ON a.codeschemeid = c.schemeid
) AS X
GROUP BY X.category_db_id
HAVING COUNT(*) > 1;
This usually points out the error of my ways.
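(As a follow-up, if the duplicates in the source are expected and you just want one row per key, the same wrapper can feed the INSERT directly. This is only a sketch: the target table name and the ORDER BY column used as a tie-breaker are assumptions, not part of the original post.)
-- Sketch only: keep a single row per category_db_id before inserting.
-- dbo.code_category (target) and the ORDER BY column are assumed names.
INSERT INTO dbo.code_category (category_db_id)
SELECT X.category_db_id
FROM
(
    SELECT b.category_db_id,
           ROW_NUMBER() OVER (PARTITION BY b.category_db_id
                              ORDER BY b.categoryid) AS rn
    FROM code_raw a
    JOIN category b ON a.categoryid = b.categoryid
    JOIN schemes c ON a.codeschemeid = c.schemeid
) AS X
WHERE X.rn = 1;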
Todd Fifield