Viewing 15 posts - 1 through 15 (of 33 total)
Thank you. Both solutions worked very well.
November 5, 2021 at 6:26 pm
Thanks for both of your responses. Much appreciated. I will try both of them and see what works best for my problem case.
August 12, 2021 at 4:41 am
Sorry about the late response but this works for what I am trying to do. I will try and expand on it. Thanks guys
December 16, 2013 at 11:57 am
Thanks for the reply and the article.
To answer your question, the query is simply trying to delete row by row based on a certain join condition. It's currently using...
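A row-by-row delete like the one described can usually be rewritten as a single set-based statement. A minimal sketch using Python's sqlite3 (the `target`/`stale` tables and columns are made up for illustration; SQL Server would use its `DELETE ... FROM ... JOIN` form, but the idea is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical tables: rows in "target" whose id also appears in "stale"
# are the ones the join condition would match.
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
cur.execute("CREATE TABLE stale (id INTEGER PRIMARY KEY)")
cur.executemany("INSERT INTO target VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c"), (4, "d")])
cur.executemany("INSERT INTO stale VALUES (?)", [(2,), (4,)])

# One set-based DELETE instead of a row-by-row loop:
cur.execute("DELETE FROM target WHERE id IN (SELECT id FROM stale)")

remaining = [row[0] for row in cur.execute("SELECT id FROM target ORDER BY id")]
print(remaining)  # [1, 3]
```

The engine can then plan the whole delete at once instead of paying per-row overhead in a loop.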
October 24, 2013 at 10:49 am
Hey Luiz, thanks for your response.
You are right, the data is just something I made up and created on the fly, and there is no change...
October 24, 2013 at 10:19 am
Thanks for your response. How do I replace the query above with a set-based approach? What exactly would that look like?
I am not worried about seeing a performance difference...
October 24, 2013 at 10:04 am
Wow, interesting. Nice article by Jeff as always.
Thanks guys. I will modify this to suit my needs.
March 7, 2013 at 11:20 am
Thanks for the prompt response. Yes, your assumptions are correct.
Can you clarify this statement? I think it contains some special characters:
SELECT COUNT(*) + 1 FROM #temp1 WHERE DataDifference >...
March 7, 2013 at 12:41 am
Oh not at all, I can do it any way I like, it was just something I noticed that both solutions had in common. Thanks guys
June 9, 2011 at 8:50 am
Both great solutions, I appreciate them. Seems like there is almost no way to get around using the HAVING clause. Interesting.
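The reason HAVING is hard to avoid here is that the filter applies to an aggregate, which a plain WHERE cannot see. A small sketch with Python's sqlite3 (the `orders` table and its data are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [("ann", 10), ("ann", 20), ("bob", 5),
                 ("cat", 7), ("cat", 8), ("cat", 9)])

# WHERE filters rows before grouping; HAVING filters the groups themselves,
# so a condition on COUNT(*) has to live in HAVING.
rows = cur.execute(
    "SELECT customer, COUNT(*) AS n FROM orders "
    "GROUP BY customer HAVING COUNT(*) > 1 ORDER BY customer"
).fetchall()
print(rows)  # [('ann', 2), ('cat', 3)]
```

The only real alternative is wrapping the aggregate in a subquery or CTE and filtering it with WHERE outside, which is the same work spelled differently.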
June 9, 2011 at 8:44 am
Thanks for the responses
Mr and Mrs. 500, the data in there is just something I made up; the duplicates do not matter (or do they?)
Lowell, the logic is to update...
August 24, 2010 at 9:44 am
Thank you very much for your responses. They have been taken into consideration.
March 15, 2010 at 12:42 pm
That worked like a charm. The reason it didn't work for me previously was that I used DENSE_RANK() OVER (PARTITION BY id). I have learned a lot from...
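The PARTITION BY clause is the likely culprit: it restarts the ranking inside each id group rather than ranking the whole set. A quick sketch with Python's sqlite3 (needs SQLite 3.25+ for window functions; the `scores` table is made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE scores (id INTEGER, score INTEGER)")
cur.executemany("INSERT INTO scores VALUES (?, ?)",
                [(1, 50), (1, 70), (1, 70), (2, 30), (2, 90)])

# DENSE_RANK restarts at 1 for each partition and leaves no gaps on ties,
# so ranks are only comparable within one id, not across the whole table.
rows = cur.execute(
    "SELECT id, score, DENSE_RANK() OVER (PARTITION BY id ORDER BY score DESC) "
    "FROM scores ORDER BY id, score DESC"
).fetchall()
print(rows)
# [(1, 70, 1), (1, 70, 1), (1, 50, 2), (2, 90, 1), (2, 30, 2)]
```

Dropping the PARTITION BY (or partitioning on a different column) gives a single ranking over all rows, which is often what was actually wanted.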
January 14, 2010 at 11:26 am
Thanks but I tried the dense_rank and it didn't work for me. Great piece of information learned though.
January 14, 2010 at 9:42 am