April 24, 2012 at 6:47 am
Just a hypothetical representation of the text.
c# .NET developer, TSQL
April 24, 2012 at 6:49 am
If I try your query, I don't get any records?
April 24, 2012 at 6:53 am
Can you replace the "SomeText" in the tables with the actual text, so we can see what you are actually looking at?
That would help us get you a solution faster.
April 24, 2012 at 6:54 am
The_Mo (4/24/2012)
If I try your query, I don't get any records?
Did you change the table names? You should have gotten two errors if you just ran my query unchanged.
EDIT: it should be an OR instead of an AND in the WHERE clause; fixed in the original post.
For performance issues, see how we like them posted here: How to Post Performance Problems - Gail Shaw
Need to split some strings? Jeff Moden's DelimitedSplit8K
Jeff Moden's Cross tab and Pivots Part 1
Jeff Moden's Cross tab and Pivots Part 2
April 24, 2012 at 7:09 am
The_Mo (4/24/2012)
Just a hypothetical representation of the text.
You need to explain your "match" and "no match" rules!
Is the match between the data in tableOne and tableTwo by DateValue only? By DateValue and Value?
Or by DateValue, Value, and the number of their occurrences?
April 24, 2012 at 10:25 am
You need to explain your "match" and "no match" rules!
Is the match between the data in tableOne and tableTwo by DateValue only? By DateValue and Value?
Or by DateValue, Value, and the number of their occurrences?
The thing is, I have two tables: table one and table two. As you can see, the match is by DateValue, Value, and the number of their occurrences. The final result should be what you see in the result table.
April 24, 2012 at 10:58 am
This?
BEGIN TRAN;

WITH cte AS
(
    SELECT *, rn = ROW_NUMBER() OVER (PARTITION BY DateValue, Value ORDER BY DateValue)
    FROM #tableOne
),
DeleteCount AS
(
    SELECT DateValue, Value, ct = COUNT(*)
    FROM #tableTwo
    GROUP BY DateValue, Value
)
DELETE t1
--SELECT *
FROM cte t1
JOIN DeleteCount t2
    ON  t1.DateValue = t2.DateValue
    AND t1.Value     = t2.Value
    AND t1.rn        <= t2.ct;

--COMMIT
--ROLLBACK
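For readers skimming the thread: the idea in the query above is to number the duplicate rows in #tableOne, count the duplicates in #tableTwo, and delete only rows whose duplicate-number falls within that count, so any surplus copies survive. A minimal, self-contained sketch of the same rn-vs-count pattern (hypothetical table and values, not the poster's data):

```sql
-- Hypothetical demo of the "delete only as many copies as the count allows" pattern.
DECLARE @t TABLE (v varchar(10));
INSERT INTO @t (v) VALUES ('A'), ('A'), ('A'), ('B');

WITH cte AS
(
    -- Number each duplicate within its group.
    SELECT v, ROW_NUMBER() OVER (PARTITION BY v ORDER BY (SELECT NULL)) AS rn
    FROM @t
)
DELETE FROM cte            -- deleting through the CTE removes rows from @t
WHERE v = 'A' AND rn <= 2; -- delete at most two copies of 'A'

SELECT v FROM @t;          -- remaining rows: one 'A' and one 'B'
```

The same shape works against real tables once the count comes from a second table, as in the query above. The row-constructor VALUES list requires SQL Server 2008 or later.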
April 24, 2012 at 11:17 am
Almost SSCrazy,
I have tweaked your code a bit and this is the result:
--===== table one
CREATE TABLE #tableOne
(
DateValue DATETIME,
Value varchar(50)
)
INSERT INTO #tableOne(DateValue,Value) VALUES('2012-04-29 08:30:00','SOME TEXT');
INSERT INTO #tableOne(DateValue,Value) VALUES('2012-04-29 08:30:00','SOME TEXT');
INSERT INTO #tableOne(DateValue,Value) VALUES('2012-04-29 08:30:00','SOME TEXT');
INSERT INTO #tableOne(DateValue,Value) VALUES('2012-04-29 09:00:00','SOME TEXT');
INSERT INTO #tableOne(DateValue,Value) VALUES('2012-04-29 16:00:00','SOME TEXT');
--===== table two
CREATE TABLE #tableTwo
(
DateValue DATETIME,
Value varchar(50)
)
INSERT INTO #tableTwo(DateValue,Value) VALUES('2012-04-29 08:30:00','SOME TEXT');
INSERT INTO #tableTwo(DateValue,Value) VALUES('2012-04-29 08:30:00','SOME TEXT');
INSERT INTO #tableTwo(DateValue,Value) VALUES('2012-04-29 09:00:00','SOME TEXT');
INSERT INTO #tableTwo(DateValue,Value) VALUES('2012-04-29 10:00:00','SOME TEXT');
--===== table result
CREATE TABLE #tableResult
(
DateValue DATETIME,
Value varchar(50)
)
INSERT INTO #tableResult(DateValue,Value) VALUES('2012-04-29 08:30:00','SOME TEXT');
INSERT INTO #tableResult(DateValue,Value) VALUES('2012-04-29 16:00:00','SOME TEXT');
WITH cte AS
(
    SELECT *, rn = ROW_NUMBER() OVER (PARTITION BY DateValue, Value ORDER BY DateValue)
    FROM #tableOne
),
DeleteCount AS
(
    SELECT DateValue, Value, ct = COUNT(*)
    FROM #tableTwo
    GROUP BY DateValue, Value
)
SELECT *
FROM cte t1
LEFT OUTER JOIN DeleteCount t2
    ON  t1.DateValue = t2.DateValue
    AND t1.Value     = t2.Value
    AND t1.rn        <= t2.ct
WHERE t2.DateValue IS NULL;
drop TABLE #tableOne
drop TABLE #tableTwo
drop TABLE #tableResult
This is the result I wanted :w00t:
April 24, 2012 at 11:34 am
What happened to '2012-04-29 10:00:00','SOME TEXT'? Or do you want only the rows from table one that are not duplicated in table two?
April 24, 2012 at 11:41 am
Exactly :-D
April 24, 2012 at 12:15 pm
The_Mo (4/24/2012)
exactly:-D
OK, so my solution was wrong anyway. Here is the corrected version:
SELECT *
FROM
(
    SELECT ROW_NUMBER() OVER (PARTITION BY DateValue, Value ORDER BY (SELECT NULL)) AS RowNum, *
    FROM #tableOne
) t1
LEFT JOIN
(
    SELECT ROW_NUMBER() OVER (PARTITION BY DateValue, Value ORDER BY (SELECT NULL)) AS RowNum, *
    FROM #tableTwo
) t2
    ON  t1.DateValue = t2.DateValue
    AND t1.Value     = t2.Value
    AND t1.RowNum    = t2.RowNum
WHERE t2.RowNum IS NULL;
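An aside on the corrected query above: it pairs the Nth copy of each (DateValue, Value) in #tableOne with the Nth copy in #tableTwo, so only the surplus copies fail to match and survive the IS NULL filter. The same anti-join can also be written with NOT EXISTS; this is just a sketch, assuming the #tableOne / #tableTwo temp tables created earlier in the thread:

```sql
WITH t1 AS
(
    SELECT DateValue, Value,
           ROW_NUMBER() OVER (PARTITION BY DateValue, Value ORDER BY (SELECT NULL)) AS RowNum
    FROM #tableOne
),
t2 AS
(
    SELECT DateValue, Value,
           ROW_NUMBER() OVER (PARTITION BY DateValue, Value ORDER BY (SELECT NULL)) AS RowNum
    FROM #tableTwo
)
SELECT t1.DateValue, t1.Value
FROM t1
WHERE NOT EXISTS
(
    -- Anti-join: keep a row only if no same-numbered copy exists in table two.
    SELECT 1
    FROM t2
    WHERE t2.DateValue = t1.DateValue
      AND t2.Value     = t1.Value
      AND t2.RowNum    = t1.RowNum
);
-- With the sample data posted above, this should return the third '08:30' row and the '16:00' row.
```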
April 24, 2012 at 12:37 pm
Nice, works well. Thanks! I learned a lot :w00t:
April 24, 2012 at 12:59 pm
Here is how I would approach this.
WITH CTE AS (
SELECT DateValue, [Value], ROW_NUMBER() OVER( PARTITION BY DateValue, [Value] ORDER BY (SELECT 1) ) AS rn
FROM #tableOne
EXCEPT
SELECT DateValue, [Value], ROW_NUMBER() OVER( PARTITION BY DateValue, [Value] ORDER BY (SELECT 1) )
FROM #tableTwo
)
SELECT DateValue, [Value]
FROM CTE;
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
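A note on why the ROW_NUMBER matters in the EXCEPT approach above: EXCEPT is a set operator and collapses duplicate rows, so without the numbering column all identical rows in #tableOne would count as a single row and vanish if even one match existed in #tableTwo. Numbering each copy makes every row distinct, so EXCEPT removes exactly as many copies as the second table holds. A tiny self-contained illustration (hypothetical values, not the thread's tables):

```sql
-- Without ROW_NUMBER, EXCEPT collapses the three identical rows to one
-- and removes it entirely:
SELECT v FROM (VALUES ('A'), ('A'), ('A')) AS t1(v)
EXCEPT
SELECT v FROM (VALUES ('A'), ('A')) AS t2(v);
-- returns no rows

-- With numbering, only the surplus third copy survives:
SELECT v, ROW_NUMBER() OVER (PARTITION BY v ORDER BY (SELECT 1)) AS rn
FROM (VALUES ('A'), ('A'), ('A')) AS t1(v)
EXCEPT
SELECT v, ROW_NUMBER() OVER (PARTITION BY v ORDER BY (SELECT 1))
FROM (VALUES ('A'), ('A')) AS t2(v);
-- returns one row: ('A', 3)
```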
April 24, 2012 at 1:04 pm
drew.allen (4/24/2012)
Here is how I would approach this.
Your code is excellent. Nice job! :smooooth: