December 19, 2013 at 5:43 am
Hi All,
I have a column StartTime in my table whose data looks like this:
StartTime
7:00 AM <br/> 12:30 PM
Now I need to split the above into two separate rows as below.
StartTime
7:00 AM
12:30 PM
How can I do this?
This is very urgent, please help me.
Thanks
Abhas.
December 19, 2013 at 6:01 am
This might help
DECLARE @StartTime VARCHAR(MAX),@X XML
SELECT @StartTime = '7:00 AM
12:30 PM'
SELECT @X = CONVERT(XML,'' + replace(@StartTime,'','</br>') + '</br>')
SELECT x.i.value('.','varchar(max)') AS [StartTime]
FROM @X.nodes('//br') x(i)
December 19, 2013 at 6:09 am
Sowbhari (12/19/2013)
This might help
DECLARE @StartTime VARCHAR(MAX),@X XML
SELECT @StartTime = '7:00 AM
12:30 PM'
SELECT @X = CONVERT(XML,'' + replace(@StartTime,'','</br>') + '</br>')
SELECT x.i.value('.','varchar(max)') AS [StartTime]
FROM @X.nodes('//br') x(i)
This is the correct SQL; somehow the tags are missing in my previous post.
DECLARE @StartTime VARCHAR(MAX),@X XML
SELECT @StartTime = '7:00 AM <br/> 12:30 PM'
SELECT @X = CONVERT(XML,'<br>' + replace(@StartTime,'<br/> ','</br><br>') + '</br>')
SELECT x.i.value('.','varchar(max)') AS [StartTime]
FROM @X.nodes('//br') x(i)
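If the times are stored in a table rather than a variable, the same trick can be applied per row with CROSS APPLY. Here's a rough sketch; the table variable and its sample rows are just made up to illustrate:
-- Made-up sample table, only to show the per-row version of the XML split
DECLARE @Schedule TABLE (StartTime VARCHAR(100))
INSERT INTO @Schedule VALUES ('7:00 AM <br/> 12:30 PM'), ('11:00 AM <br/> 2:30 PM')

SELECT x.i.value('.','varchar(100)') AS [StartTime]
FROM @Schedule s
CROSS APPLY (SELECT CONVERT(XML,'<br>' + replace(s.StartTime,'<br/> ','</br><br>') + '</br>')) c (X)
CROSS APPLY c.X.nodes('//br') x(i)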
December 19, 2013 at 6:41 am
You could also create a function to do it:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [dbo].[fnSplitList]
(
@sInputList VARCHAR(8000),
@sDelimiter VARCHAR(10)
)
RETURNS @List TABLE ( item VARCHAR(8000) )
BEGIN
DECLARE @sItem VARCHAR(8000)
WHILE CHARINDEX(@sDelimiter, @sInputList, 0) <> 0
BEGIN
SELECT @sItem = RTRIM(LTRIM(SUBSTRING(@sInputList, 1, CHARINDEX(@sDelimiter, @sInputList, 0) - 1))),
       @sInputList = RTRIM(LTRIM(SUBSTRING(@sInputList, CHARINDEX(@sDelimiter, @sInputList, 0) + LEN(@sDelimiter), LEN(@sInputList))))
IF LEN(@sItem) > 0
INSERT INTO @List
SELECT @sItem
END
IF LEN(@sInputList) > 0
INSERT INTO @List
SELECT @sInputList -- Put the last item in
RETURN
END
GO
From there you just do the following:
SELECT * FROM dbo.fnSplitList('7:00 AM delimiter 12:30 PM','delimiter')
Output :
item
7:00 AM
12:30 PM
You can then be flexible on your delimiters 🙂
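If the data is sitting in a table, you can call the function per row with CROSS APPLY. A quick sketch; the table variable name and sample row are just placeholders:
-- Placeholder sample table, only to show the per-row call
DECLARE @YourTable TABLE (StartTime VARCHAR(8000))
INSERT INTO @YourTable VALUES ('7:00 AM <br/> 12:30 PM')

SELECT split.item AS StartTime
FROM @YourTable t
CROSS APPLY dbo.fnSplitList(t.StartTime, '<br/>') split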
December 19, 2013 at 6:42 am
Try with CHARINDEX and a numbers table:
DECLARE @StartTime VARCHAR(MAX)
-- the two times separated by a carriage return as the delimiter
SELECT @StartTime = '7:00 AM' + CHAR(13) + '12:30 PM'
DECLARE @tbl TABLE
(
ID INT
)
-- build a small numbers table
INSERT INTO @tbl
SELECT TOP 100 ROW_NUMBER() OVER(ORDER BY s.object_id)
FROM sys.objects s, sys.objects si
-- each qualifying ID marks the start of an item; take everything up to the next delimiter
SELECT LTRIM(SUBSTRING(@StartTime, ID, CHARINDEX(CHAR(13), @StartTime + CHAR(13), ID) - ID)) AS StartTime
FROM @tbl
WHERE CHARINDEX(CHAR(13), CHAR(13) + @StartTime, ID) = ID
Regards,
Mitesh OSwal
+918698619998
December 19, 2013 at 7:24 am
Everyone on this thread should take a look at the link in my signature for splitting strings. In there you will find a number of ways to split a string, including the while loop and XML methods. As you continue reading you will find a set-based tally table splitter. It will blow the doors off all of the above methods for performance.
Another method that is super duper fast is Dwain Camp's pattern splitter. You can find it here: http://www.sqlservercentral.com/articles/String+Manipulation/94365/
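If you don't want to follow the links right away, the general shape of the set-based approach looks something like this. This is only a rough sketch of the tally idea, not the actual DelimitedSplit8K code from the article, and the names and single-character delimiter are my own:
-- Rough sketch of a tally/numbers based split on a single-character delimiter.
-- This is NOT the DelimitedSplit8K function from the article, just the idea behind it.
DECLARE @Str VARCHAR(8000) = '7:00 AM|12:30 PM',
        @Delim CHAR(1) = '|'

;WITH Tally (N) AS
(
    SELECT TOP (LEN(@Str)) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
    FROM sys.all_columns
)
SELECT LTRIM(RTRIM(SUBSTRING(@Str, N, CHARINDEX(@Delim, @Str + @Delim, N) - N))) AS Item
FROM Tally
WHERE SUBSTRING(@Delim + @Str, N, 1) = @Delim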
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
December 19, 2013 at 7:33 am
Cheers Sean. Will have a look 🙂
December 19, 2013 at 7:58 am
Hi Sowbhari,
Thanks for the reply, but when I use your approach I get the error below.
XML parsing: line 1, character 13, end tag does not match start tag
Thanks
Abhas
December 19, 2013 at 7:59 am
Hi,
Sorry Sowbhari,
it's working.
DECLARE @StartTime VARCHAR(MAX),@X XML
SELECT @StartTime = '7:00 AM <br/> 12:30 PM'
SELECT @X = CONVERT(XML,'<br>' + replace(@StartTime,'<br/> ','</br><br>') + '</br>')
SELECT x.i.value('.','varchar(max)') AS [StartTime]
FROM @X.nodes('//br') x(i)
Thanks.
December 19, 2013 at 8:01 am
abhas (12/19/2013)
Hi, Sorry Sowbhari,
it's working.
DECLARE @StartTime VARCHAR(MAX),@X XML
SELECT @StartTime = '7:00 AM <br/> 12:30 PM'
SELECT @X = CONVERT(XML,'<br>' + replace(@StartTime,'<br/> ','</br><br>') + '</br>')
SELECT x.i.value('.','varchar(max)') AS [StartTime]
FROM @X.nodes('//br') x(i)
Thanks.
I would again recommend you look at the articles I referenced. They will perform a LOT better in most situations. Try them out and see what works for you, but the DelimitedSplit8K function is crazy fast.
December 19, 2013 at 8:48 am
Sean Lange (12/19/2013)
abhas (12/19/2013)
Hi, Sorry Sowbhari,
it's working.
DECLARE @StartTime VARCHAR(MAX),@X XML
SELECT @StartTime = '7:00 AM <br/> 12:30 PM'
SELECT @X = CONVERT(XML,'<br>' + replace(@StartTime,'<br/> ','</br><br>') + '</br>')
SELECT x.i.value('.','varchar(max)') AS [StartTime]
FROM @X.nodes('//br') x(i)
Thanks.
I would again recommend you look at the articles I referenced. They will perform a LOT better in most situations. Try them out and see what works for you but the DelimitedSplit8K function is crazy fast.
+1 +1 +1 This function performs better than any other string splitting function I've seen and performance is very linear as you run it with larger data sets. It requires a bit of stretching at first, but once you "get it" it'll change the way you look at data. To test it out for yourself, run both solutions up to 1,000,000 rows and compare performance.
When you're done splitting things up, you may want to consider addressing the root cause - an application (or load process) putting two values in one column. Can the user enter 3 values? How about 17 values? Once the application is fixed, you can fix the data that's there.
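If you want to run that comparison yourself, something along these lines knocks together a throwaway million-row test table to feed each splitter; the table name and row count are just placeholders:
-- Throwaway test data: a million copies of the two-value pattern (names are placeholders)
IF OBJECT_ID('tempdb..#TestTimes') IS NOT NULL DROP TABLE #TestTimes

SELECT TOP (1000000)
       CAST('7:00 AM <br/> 12:30 PM' AS VARCHAR(100)) AS StartTime
INTO   #TestTimes
FROM   sys.all_columns a
CROSS JOIN sys.all_columns b

-- Turn on SET STATISTICS TIME ON, run each splitting method against #TestTimes, and compare.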
December 19, 2013 at 5:18 pm
Sean Lange (12/19/2013)
Everyone on this thread should take a look at the link in my signature for splitting strings. In there you will find a number of ways to split a string, including the while loop and XML methods. As you continue reading you will find a set-based tally table splitter. It will blow the doors off all of the above methods for performance. Another method that is super duper fast is Dwain Camp's pattern splitter. You can find it here: http://www.sqlservercentral.com/articles/String+Manipulation/94365/
I appreciate the vote of confidence Sean, but I think in this case I'd opt for something simpler like this:
WITH SampleData (StartTime) AS
(
SELECT '7:00 AM <br/> 12:30 PM'
UNION ALL SELECT '11:00 AM <br/> 2:30 PM'
)
SELECT StartTime, StartTimes
FROM SampleData a
CROSS APPLY
(
SELECT REPLACE(StartTime, ' <br/> ', ' ')
) b (st)
CROSS APPLY
(
VALUES(LEFT(st, CHARINDEX('M', st))),(STUFF(st, 1, CHARINDEX('M', st)+1, ''))
) c (StartTimes)
The CROSS APPLY VALUES approach to UNPIVOT is explained in the first article in my signature links.
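For reference, against those two sample rows the query returns something like:
StartTime                StartTimes
7:00 AM <br/> 12:30 PM   7:00 AM
7:00 AM <br/> 12:30 PM   12:30 PM
11:00 AM <br/> 2:30 PM   11:00 AM
11:00 AM <br/> 2:30 PM   2:30 PM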
BTW. Last name is "Camps" so possessive would be Camps' 😀
My thought question: Have you ever been told that your query runs too fast?
My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.
Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
December 20, 2013 at 7:24 am
Last name is "Camps" so possessive would be Camps' 😀
Doh!!! Sorry about getting your name wrong. :blush:
January 1, 2014 at 12:17 pm
Sean Lange (12/20/2013)
Last name is "Camps" so possessive would be Camps' 😀
Doh!!! Sorry about getting your name wrong. :blush:
Dwain must have been "camping out" just waiting for you to make that mistake. I know, I know... you've both had mo-den enough of those kinds of jokes. 😛
--Jeff Moden
Change is inevitable... Change for the better is not.
January 1, 2014 at 12:40 pm
Ed Wagner (12/19/2013)
+1 +1 +1 This function performs better than any other string splitting function I've seen and performance is very linear as you run it with larger data sets. It requires a bit of stretching at first, but once you "get it" it'll change the way you look at data. To test it out for yourself, run both solutions up to 1,000,000 rows and compare performance.
Doesn't matter how nice you are, Ed, it's still your turn to buy lunch. 😛
--Jeff Moden
Change is inevitable... Change for the better is not.