February 17, 2012 at 5:38 pm
DELETE FROM tableA
WHERE pid NOT IN
    ( SELECT pid, CN, mk, te, yr, st, cr, cc
      FROM tableA
      WHERE st = 812 AND te = 1 AND yr = 11
      GROUP BY pid, CN, mk, te, yr, st, cr, cc
      HAVING COUNT(*) > 1 )
I am trying to run this query but I am getting this error message:
Msg 116, Level 16, State 1, Line 7
Only one expression can be specified in the select list when the subquery is not introduced with EXISTS.
Can somebody help? Thank you so much.
February 17, 2012 at 5:44 pm
Change select pid, CN, mk, te, yr, st, cr, cc from
to select pid from
You can't return multiple columns from a subquery when you use it with IN in a WHERE clause. SELECT only the column you use for the comparison; in your case, pid.
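Applied to your query, the change looks like this (a sketch of the syntax fix only; double-check that the NOT IN logic deletes the rows you actually intend):

DELETE FROM tableA
WHERE pid NOT IN
    ( SELECT pid
      FROM tableA
      WHERE st = 812 AND te = 1 AND yr = 11
      GROUP BY pid, CN, mk, te, yr, st, cr, cc
      HAVING COUNT(*) > 1 )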
February 17, 2012 at 6:01 pm
The only problem is that this table holds all the student transcripts. There are several records for one ID. There are duplicates only when I include all those fields. I am not sure how to limit the query to those duplicates and either update or delete them.
February 17, 2012 at 6:35 pm
Please follow this article and provide us with more information on what you need.
http://www.sqlservercentral.com/articles/Best+Practices/61537/
February 21, 2012 at 2:15 pm
Thank you for pointing out the forum etiquette to me. I will keep that in mind from now on.
This is my problem.
I have a table that contains student transcripts, and it has duplicates I am trying to delete. The primary key in this table is PID, the permanent ID of the student, plus SQ, a sequence number. Each student can have several records under one PID, with SQ incremented for each. The transcript table has several records per student because it holds the grades and credits for various classes across various terms and years. The way to find the duplicates is to run this query:
SELECT PID, Course#, mark, Credits, Grade, term, year, SchoolTaken
FROM HIS
WHERE ST = schoolcode AND term = 1 AND year = 10
GROUP BY PID, Course#, mark, Credits, Grade, term, year, SchoolTaken
HAVING COUNT(*) > 1
Is there a way I can delete these duplicate records?
February 21, 2012 at 2:28 pm
Are you trying to delete all records but one for the PID or just duplicates?
February 21, 2012 at 2:31 pm
I am trying to delete all the duplicates. The duplicates are the ones returned by the query I listed earlier. The primary key for this table is PID plus SQ (a sequence number). There are several records for one PID, since a student's transcript spans multiple rows.
February 21, 2012 at 2:34 pm
Do you have a created_on field or a similar field that records the date or datetime of entry?
February 21, 2012 at 2:39 pm
Yes, there is a field named DTS which has the date time stamp.
February 21, 2012 at 2:45 pm
http://support.microsoft.com/kb/139444
Give this link a shot. If you are still having issues shoot a reply.
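For reference, the pattern in that article is: capture the key values that repeat, copy one distinct instance of each duplicated row into a holding table, delete the originals, and re-insert the copies. A rough sketch against your HIS table follows (the temp table names are just for illustration; also, since your primary key includes SQ, the rows are not exact duplicates, so SQ has to be left out of the DISTINCT and regenerated on the re-insert):

-- 1. Key values that occur more than once
SELECT PID, Course#, mark, Credits, Grade, term, year, SchoolTaken
INTO #holdkey
FROM HIS
GROUP BY PID, Course#, mark, Credits, Grade, term, year, SchoolTaken
HAVING COUNT(*) > 1

-- 2. One distinct copy of each duplicated row (SQ deliberately omitted)
SELECT DISTINCT h.PID, h.Course#, h.mark, h.Credits, h.Grade,
       h.term, h.year, h.SchoolTaken
INTO #holddups
FROM HIS h
JOIN #holdkey k
  ON  h.PID = k.PID AND h.Course# = k.Course# AND h.mark = k.mark
  AND h.Credits = k.Credits AND h.Grade = k.Grade AND h.term = k.term
  AND h.year = k.year AND h.SchoolTaken = k.SchoolTaken

-- 3. Remove every duplicated original
DELETE h
FROM HIS h
JOIN #holdkey k
  ON  h.PID = k.PID AND h.Course# = k.Course# AND h.mark = k.mark
  AND h.Credits = k.Credits AND h.Grade = k.Grade AND h.term = k.term
  AND h.year = k.year AND h.SchoolTaken = k.SchoolTaken

-- 4. Put one copy of each back (SQ must be regenerated here;
--    if it is not an IDENTITY column this step needs more work)
INSERT INTO HIS (PID, Course#, mark, Credits, Grade, term, year, SchoolTaken)
SELECT PID, Course#, mark, Credits, Grade, term, year, SchoolTaken
FROM #holddups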
February 21, 2012 at 2:46 pm
Try using ROW_NUMBER() OVER(PARTITION BY ...) - I think that would work very well for you in this scenario.
Something like this maybe?
DELETE FROM a
FROM
(SELECT pid, CN, mk, te, yr, st, cr, cc
, ROW_NUMBER() OVER (PARTITION BY pid, CN, mk, te, yr, st, cr, cc) RowNumber
FROM YourTable) a
WHERE a.RowNumber > 1
______________________________________________________________________________
Never argue with an idiot; they'll drag you down to their level and beat you with experience.
February 22, 2012 at 11:30 am
If you're going to use the ROW_NUMBER function to list the duplicates, you'll probably need an ORDER BY in the OVER clause as well.
February 22, 2012 at 11:33 am
Yes, you are correct. I am trying to figure this out, as I am getting the error: 'The ranking function "ROW_NUMBER" must have an ORDER BY clause'.
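For the record, adding an ORDER BY seems to clear it. Something like this is what I am testing (a sketch using the column names from my HIS query, with DTS as the tiebreaker so the earliest entry in each group survives; I rewrote the derived table as a CTE):

;WITH dups AS
(
    SELECT ROW_NUMBER() OVER (PARTITION BY PID, Course#, mark, Credits, Grade,
                                           term, year, SchoolTaken
                              ORDER BY DTS) AS RowNumber
    FROM HIS
)
DELETE FROM dups
WHERE RowNumber > 1

The ORDER BY just decides which row in each partition gets RowNumber = 1 and is therefore kept.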
February 22, 2012 at 11:37 am
Oops!!!
______________________________________________________________________________
Never argue with an idiot; they'll drag you down to their level and beat you with experience.
February 22, 2012 at 11:44 am
Only reason I caught it was I spent the last week writing these as part of an ETL load 😉