November 20, 2009 at 5:02 am
Hi,
My table structure is:
id   mail       date
1    kk@y.com   15/4/09
1    null       15/5/09
1    null       20/4/09
.............................
Now, I have to find the maximum date. If the mail on that date is null, I should take the second maximum date; if that one is null too, move on to the third maximum date, and so on until I find valid data.
How can I do this?
thanks,
regards
anamika
November 20, 2009 at 5:16 am
Can we have some test data in consumable format please?
---------------------------------------------------------------------------------
November 20, 2009 at 5:32 am
If the following doesn't give you the expected result, you need to follow Nabha's advice and give us something to test against.
For details on how to post sample data please see the first link in my signature.
SELECT id, MAX(mail), MAX(date)
FROM table
WHERE mail IS NOT NULL
GROUP BY id
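One caveat worth checking against your real data: MAX(mail) and MAX(date) are aggregated independently per id, so the mail returned is simply the alphabetically largest non-NULL value and may not come from the same row as the latest date.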
November 20, 2009 at 5:50 am
Email1          modifieddate          mergeid1
NULL            Sep 15 2009  4:02PM   1
kk@aii.edu      Jun 16 2009 12:44PM   1
na@aol.com      Jun  4 2009 11:28AM   1
NULL            Apr  8 2009 10:40AM   1
NULL            Aug 26 2008  1:11PM   1
lm@wc.com       Nov 24 2008  8:47AM   1
NULL            Jan 16 2008 12:40AM   2
NULL            Jan 16 2008 12:40AM   2
NULL            Jan 16 2008 12:40AM   2
NULL            Jan 16 2008 12:40AM   2
wnleiaa@a.com   Sep  2 2009  2:33AM   2
NULL            Jan 16 2008 12:40AM   2
NULL            Jul 31 2008 12:00AM   3
NULL            Jul 31 2008 12:00AM   3
November 20, 2009 at 6:11 am
Since you've spent quite a lot of time on SSC, I thought you would know what a 'consumable' format is. Even Lutz's link would have directed you.
Ok lemme help 🙂
CREATE TABLE #t (id INT, mail VARCHAR(50), date DATETIME)

INSERT INTO #t VALUES (1, 'kk@y.com', '2009-04-15')
INSERT INTO #t VALUES (1, NULL,       '2009-05-15')
INSERT INTO #t VALUES (1, NULL,       '2009-05-20')
INSERT INTO #t VALUES (1, 'kk@x.com', '2009-04-14')
INSERT INTO #t VALUES (2, 'kk@z.com', '2009-04-15')
INSERT INTO #t VALUES (2, NULL,       '2009-05-15')
INSERT INTO #t VALUES (2, NULL,       '2009-05-20')

SELECT * FROM #t

-- For each id, pick the most recent row that has a non-NULL mail
SELECT *
FROM
(
    SELECT *, ROW_NUMBER() OVER (PARTITION BY id ORDER BY date DESC) AS row_no
    FROM #t
    WHERE mail IS NOT NULL
) t
WHERE row_no = 1
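Against that sample data, the final query should return one row per id (the most recent row with a non-NULL mail), roughly:
id  mail      date        row_no
1   kk@y.com  2009-04-15  1
2   kk@z.com  2009-04-15  1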
Is this what you want?
---------------------------------------------------------------------------------
November 20, 2009 at 7:13 am
Hi Nabha,
I'll stop guessing and step back until I see some expected results (now that you've done the first part of providing sample data 😉 ).
November 20, 2009 at 7:20 am
Sure Lutz, that one part (the expected result) of the whole episode lies with the OP. Can't really help there 🙂
---------------------------------------------------------------------------------
November 22, 2009 at 2:10 am
Another way of achieving the same result... I am not sure about the nature of your ID column. It appeared to me to be a candidate for a primary key, but since it has repeating values it can't be that, so I am assuming it is just any other column.
DECLARE @t TABLE
(
    ID INT,
    MAIL VARCHAR(55),
    EMAILDATE DATE
)

INSERT INTO @t
SELECT 1, 'A', GETDATE()
UNION ALL SELECT 1, 'B', DATEADD(dd, -1, GETDATE())
UNION ALL SELECT 1, 'C', DATEADD(dd, -10, GETDATE())
UNION ALL SELECT 1, NULL, DATEADD(dd, -11, GETDATE())
UNION ALL SELECT 1, NULL, DATEADD(dd, 1, GETDATE())
UNION ALL SELECT 1, NULL, DATEADD(dd, 2, GETDATE())
UNION ALL SELECT 1, 'C', DATEADD(dd, 3, GETDATE())
UNION ALL SELECT 1, NULL, DATEADD(dd, 4, GETDATE())
UNION ALL SELECT 1, NULL, DATEADD(dd, 5, GETDATE())

SELECT TOP 1 *
FROM @t
WHERE MAIL IS NOT NULL
ORDER BY EMAILDATE DESC
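If the latest non-NULL mail is needed per ID rather than across the whole table, one way to keep the TOP 1 style is a CROSS APPLY per ID. This is only a sketch against the table variable above, not tested:

-- Per-ID variant of the TOP 1 approach (sketch)
SELECT ids.ID, x.MAIL, x.EMAILDATE
FROM (SELECT DISTINCT ID FROM @t) AS ids
CROSS APPLY
(
    SELECT TOP 1 t.MAIL, t.EMAILDATE
    FROM @t AS t
    WHERE t.ID = ids.ID
      AND t.MAIL IS NOT NULL
    ORDER BY t.EMAILDATE DESC
) AS x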
Saurabh Dwivedy
___________________________________________________________
My Blog: http://tinyurl.com/dwivedys
For better, quicker answers, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537
November 22, 2009 at 9:59 pm
Hi all,
I apologize for not providing data in a consumable format. I'm really sorry.
Thanks to Nabha, that is the result set I want.
Saurabh Dwivedy, your query returns only the top 1 record, and the ID is not taken into account.
thanks,
regards,
anamika
November 23, 2009 at 12:30 am
Anamika (11/22/2009)
Hi all, I apologize for not providing data in a consumable format. I'm really sorry.
Thanks to Nabha, that is the result set I want.
thanks,
regards,
anamika
You're welcome 🙂. In future, make sure you have the test data ready. It makes it easier for anyone looking at the question to work on it.
---------------------------------------------------------------------------------