October 24, 2013 at 5:04 am
Hi Expert,
create table tmp (ID INT , NAME varchar(100));
insert into tmp values (1121, 'Tendulkar, Sachin [8691992],Sehwag, Virendra [8691991]');
insert into tmp values (1122, 'rrr, ttt [8691984],eee, rrr [8691986],ggg, jjj [8691982],kkk, uuu [8691987]');
insert into tmp values (1133, 'Ganguly, Sourav [8691980]');
I need to produce output like the one below:
ID NAME COUNT
1121 'Tendulkar, Sachin [8691992],Sehwag, Virendra [8691991]' 2
1122 'rrr, ttt [8691984],eee, rrr [8691986],ggg, jjj [8691982],kkk, uuu [8691987]' 4
1133 'Ganguly, Sourav [8691980]' 1
Regards,
KRaj
October 24, 2013 at 5:31 am
Not exactly sure what you're trying to do. Here's my stab in the dark: -
SELECT ID, NAME, COUNT(*) AS [COUNT]
FROM tmp
CROSS APPLY [dbo].[DelimitedSplit8K](NAME,' ')
WHERE Item LIKE '%[[]%]%'
GROUP BY ID, NAME
ORDER BY ID;
Which produces: -
ID NAME COUNT
----------- ---------------------------------------------------------------------------------------------------- -----------
1121 Tendulkar, Sachin [8691992],Sehwag, Virendra [8691991] 2
1122 rrr, ttt [8691984],eee, rrr [8691986],ggg, jjj [8691982],kkk, uuu [8691987] 4
1133 Ganguly, Sourav [8691980] 1
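In case the LIKE pattern looks odd: the [[] part makes the opening bracket a literal [ instead of the start of a wildcard character range, so the WHERE clause only keeps the split items that contain a bracketed number. A quick throwaway illustration (not part of the solution, and the column aliases are just made up for the example): -
SELECT CASE WHEN '[8691992]' LIKE '%[[]%]%' THEN 'match' ELSE 'no match' END AS BracketItem,
       CASE WHEN 'Sachin' LIKE '%[[]%]%' THEN 'match' ELSE 'no match' END AS PlainItem;
The first column returns "match" and the second "no match", which is what lets the GROUP BY count one row per bracketed entry.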
Solution uses the 8K Splitter - http://www.sqlservercentral.com/articles/Tally+Table/72993/: -
CREATE FUNCTION [dbo].[DelimitedSplit8K]
--===== Define I/O parameters
(@pString VARCHAR(8000), @pDelimiter CHAR(1))
--WARNING!!! DO NOT USE MAX DATA-TYPES HERE! IT WILL KILL PERFORMANCE!
RETURNS TABLE WITH SCHEMABINDING AS
RETURN
--===== "Inline" CTE Driven "Tally Table" produces values from 1 up to 10,000...
-- enough to cover VARCHAR(8000)
WITH E1(N) AS (
SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
), --10E+1 or 10 rows
E2(N) AS (SELECT 1 FROM E1 a, E1 b), --10E+2 or 100 rows
E4(N) AS (SELECT 1 FROM E2 a, E2 b), --10E+4 or 10,000 rows max
cteTally(N) AS (--==== This provides the "base" CTE and limits the number of rows right up front
-- for both a performance gain and prevention of accidental "overruns"
SELECT TOP (ISNULL(DATALENGTH(@pString),0)) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) FROM E4
),
cteStart(N1) AS (--==== This returns N+1 (starting position of each "element" just once for each delimiter)
SELECT 1 UNION ALL
SELECT t.N+1 FROM cteTally t WHERE SUBSTRING(@pString,t.N,1) = @pDelimiter
),
cteLen(N1,L1) AS(--==== Return start and length (for use in substring)
SELECT s.N1,
ISNULL(NULLIF(CHARINDEX(@pDelimiter,@pString,s.N1),0)-s.N1,8000)
FROM cteStart s
)
--===== Do the actual split. The ISNULL/NULLIF combo handles the length for the final element when no delimiter is found.
SELECT ItemNumber = ROW_NUMBER() OVER(ORDER BY l.N1),
Item = SUBSTRING(@pString, l.N1, l.L1)
FROM cteLen l
;
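If you want to see what the splitter returns before it gets filtered and counted, you can run it on one of your strings directly (just an illustration; ItemNumber and Item are the two columns the function above returns): -
SELECT s.ItemNumber, s.Item
FROM [dbo].[DelimitedSplit8K]('Tendulkar, Sachin [8691992],Sehwag, Virendra [8691991]', ' ') AS s;
You get one row per space-separated chunk, and only the chunks containing a bracketed number survive the WHERE Item LIKE '%[[]%]%' filter in the query above.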
October 24, 2013 at 6:17 am
Hi Expert,
Thanks for your reply, but it didn't work.
October 24, 2013 at 6:18 am
What is [DelimitedSplit8K] in your SQL?
October 24, 2013 at 6:31 am
kiran.rajenimbalkar (10/24/2013)
what is [DelimitedSplit8K] in your sql ?
Did you just copy and paste without reading what I wrote? Let me show you the relevant section that I'm talking about...
Cadavre (10/24/2013)
Solution uses the 8K Splitter - http://www.sqlservercentral.com/articles/Tally+Table/72993/: -
CREATE FUNCTION [dbo].[DelimitedSplit8K]
--===== Define I/O parameters
(@pString VARCHAR(8000), @pDelimiter CHAR(1))
--WARNING!!! DO NOT USE MAX DATA-TYPES HERE! IT WILL KILL PERFORMANCE!
RETURNS TABLE WITH SCHEMABINDING AS
RETURN
--===== "Inline" CTE Driven "Tally Table" produces values from 1 up to 10,000...
-- enough to cover VARCHAR(8000)
WITH E1(N) AS (
SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
), --10E+1 or 10 rows
E2(N) AS (SELECT 1 FROM E1 a, E1 b), --10E+2 or 100 rows
E4(N) AS (SELECT 1 FROM E2 a, E2 b), --10E+4 or 10,000 rows max
cteTally(N) AS (--==== This provides the "base" CTE and limits the number of rows right up front
-- for both a performance gain and prevention of accidental "overruns"
SELECT TOP (ISNULL(DATALENGTH(@pString),0)) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) FROM E4
),
cteStart(N1) AS (--==== This returns N+1 (starting position of each "element" just once for each delimiter)
SELECT 1 UNION ALL
SELECT t.N+1 FROM cteTally t WHERE SUBSTRING(@pString,t.N,1) = @pDelimiter
),
cteLen(N1,L1) AS(--==== Return start and length (for use in substring)
SELECT s.N1,
ISNULL(NULLIF(CHARINDEX(@pDelimiter,@pString,s.N1),0)-s.N1,8000)
FROM cteStart s
)
--===== Do the actual split. The ISNULL/NULLIF combo handles the length for the final element when no delimiter is found.
SELECT ItemNumber = ROW_NUMBER() OVER(ORDER BY l.N1),
Item = SUBSTRING(@pString, l.N1, l.L1)
FROM cteLen l
;
As you can see, I posted that my solution required you to add the 8K Splitter; I linked to the article and included the code in case you didn't want to read it. I'd still advise reading the article, as it explains extremely well what the 8K Splitter does and how it does it, but that's entirely your choice.
October 24, 2013 at 6:52 am
Hi Expert,
Thanks, it worked, but that was very complex.
Can't we do it directly using that tmp table?
We need the count of the values in that NAME column.
October 24, 2013 at 7:07 am
kiran.rajenimbalkar (10/24/2013)
Hi Expert, thanks, it worked, but that was very complex.
Can't we do it directly using that tmp table?
We need the count of the values in that NAME column.
Everything is complicated to someone who doesn't understand it. My advice would be to read the article, learn what the code does, and reuse that sort of idea.
You could do something like this: -
SELECT ID, NAME,
(LEN(NAME)-LEN(REPLACE(REPLACE(NAME,'[',''),']','')))/2 AS [Count]
FROM tmp;
But if your "NAME" ever contains a value like this - "What, Is [98432], my, n[am]e [43287]", it'll report "3" instead of what I assume is the desired "2".
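You can see that edge case for yourself with something like this (the @s variable is just a scratch example, not part of the solution): -
DECLARE @s VARCHAR(100) = 'What, Is [98432], my, n[am]e [43287]';
SELECT (LEN(@s)-LEN(REPLACE(REPLACE(@s,'[',''),']','')))/2 AS [Count]; -- reports 3, because every [...] pair is counted, including n[am]e
So the shortcut is fine as long as square brackets only ever appear around the numbers in your NAME column.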
October 24, 2013 at 7:13 am
Also, please make sure the question you post is clear enough and includes all the required details.
October 24, 2013 at 7:18 am
Hi Expert,
Thanks, it worked for me...
Sure, in future I will keep this in mind.
Really, thanks, Sir.