May 29, 2006 at 6:40 pm
Ok, run this...
SELECT * FROM dbo.fnListSplit('a,b,c,d,e,f', ',')
-------------------------------------------------------------
CREATE FUNCTION dbo.fnListSplit
(
@List VARCHAR(8000),
@Delimiter VARCHAR(2)
)
RETURNS @Resultset TABLE (i SMALLINT IDENTITY(0, 1), x VARCHAR(8000))
AS
BEGIN
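-- Wrap the list in a leading and trailing delimiter, then use the inline
-- numbers ("tally") table below to find each delimiter position and
-- extract the token that follows it.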
INSERT @Resultset
(
x
)
SELECT SUBSTRING(@Delimiter + @List + @Delimiter, w.i + 1, CHARINDEX(@Delimiter, @Delimiter + @List + @Delimiter, w.i + 1) - w.i - 1)
FROM (
SELECT v0.n + v1.n + v2.n + v3.n i
FROM (
SELECT 0 n UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 UNION ALL SELECT 10 UNION ALL SELECT 11 UNION ALL SELECT 12 UNION ALL SELECT 13 UNION ALL SELECT 14 UNION ALL SELECT 15
) v0,
(
SELECT 0 n UNION ALL SELECT 16 UNION ALL SELECT 32 UNION ALL SELECT 48 UNION ALL SELECT 64 UNION ALL SELECT 80 UNION ALL SELECT 96 UNION ALL SELECT 112 UNION ALL SELECT 128 UNION ALL SELECT 144 UNION ALL SELECT 160 UNION ALL SELECT 176 UNION ALL SELECT 192 UNION ALL SELECT 208 UNION ALL SELECT 224 UNION ALL SELECT 240
) v1,
(
SELECT 0 n UNION ALL SELECT 256 UNION ALL SELECT 512 UNION ALL SELECT 768 UNION ALL SELECT 1024 UNION ALL SELECT 1280 UNION ALL SELECT 1536 UNION ALL SELECT 1792 UNION ALL SELECT 2048 UNION ALL SELECT 2304 UNION ALL SELECT 2560 UNION ALL SELECT 2816 UNION ALL SELECT 3072 UNION ALL SELECT 3328 UNION ALL SELECT 3584 UNION ALL SELECT 3840
) v2,
(
SELECT 0 n UNION ALL SELECT 4096
) v3
) w
WHERE w.i = CHARINDEX(@Delimiter, @Delimiter + @List + @Delimiter, w.i) AND w.i < LEN(@Delimiter + @List)
ORDER BY w.i
RETURN
END
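For reference, the call at the top should return a resultset along these lines (sketched by hand rather than pasted from a live run):
i x
- -
0 a
1 b
2 c
3 d
4 e
5 f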
N 56°04'39.16"
E 12°55'05.25"
May 30, 2006 at 2:23 am
Hi Peter,
This is really great!!
I often use lists as parameters to stored procs to insert data into tables, so this function will come in handy. I have written a couple of parsing functions myself, but I was impressed by this one.
Brij
May 30, 2006 at 2:44 am
Thank you.
There are some ways to further optimize this solution. If you know you will never have more than 256 "params", and every param has a maximum length of 50, you can shorten the function down to...
------------------------------------------------------------------------
CREATE FUNCTION dbo.fnListSplit
(
@List VARCHAR(8000),
@Delimiter VARCHAR(1)
)
RETURNS @Resultset TABLE (i SMALLINT IDENTITY(0, 1), x VARCHAR(50))
AS
BEGIN
INSERT @Resultset
(
x
)
SELECT SUBSTRING(@Delimiter + @List + @Delimiter, w.i + 1, CHARINDEX(@Delimiter, @Delimiter + @List + @Delimiter, w.i + 1) - w.i - 1)
FROM (
SELECT v0.n + v1.n i
FROM (
SELECT 0 n UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 UNION ALL SELECT 10 UNION ALL SELECT 11 UNION ALL SELECT 12 UNION ALL SELECT 13 UNION ALL SELECT 14 UNION ALL SELECT 15
) v0,
(
SELECT 0 n UNION ALL SELECT 16 UNION ALL SELECT 32 UNION ALL SELECT 48 UNION ALL SELECT 64 UNION ALL SELECT 80 UNION ALL SELECT 96 UNION ALL SELECT 112 UNION ALL SELECT 128 UNION ALL SELECT 144 UNION ALL SELECT 160 UNION ALL SELECT 176 UNION ALL SELECT 192 UNION ALL SELECT 208 UNION ALL SELECT 224 UNION ALL SELECT 240
) v1
) w
WHERE w.i = CHARINDEX(@Delimiter, @Delimiter + @List + @Delimiter, w.i) AND w.i < LEN(@Delimiter + @List)
ORDER BY w.i
RETURN
END
------------------------------------------------------------------------
and call for example with
SELECT * FROM dbo.fnListSplit('a,10,john.doe#whitehouse.gov,john doe,green,president', ',')
I have done some speed tests. Parsing 50 "params" took less than 0.003 seconds.
Parsing 1374 "params" took 0.093 seconds (with v2 and v3, of course)!
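If you want to reproduce that kind of timing, a simple harness along these lines works (my own sketch, not the actual test script used above):
DECLARE @Start DATETIME, @List VARCHAR(8000)
SELECT @Start = GETDATE(), @List = 'a,b,c,d,e,f'   -- substitute a longer list of your own
SELECT * FROM dbo.fnListSplit(@List, ',')
SELECT DATEDIFF(ms, @Start, GETDATE()) AS ElapsedMs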
N 56°04'39.16"
E 12°55'05.25"
May 30, 2006 at 4:45 am
Joe
What I meant was passing comma-separated values in a parameter in order to insert them as multiple rows into a table. Where does this violate 1NF?
For example:
Create Procedure myproc
(
@val1 int, @val2 int, @val3 int, @List1 varchar(8000)
-- e.g. 1001, 12121, 5485, '12,12,14,14,15'
)
AS
Declare @ID int
Insert into Tab1 (Col1, Col2, Col3)   -- column names are placeholders
Values (@val1, @val2, @val3)
Select @ID = SCOPE_IDENTITY()
Here I need to parse the CSV and insert the values as separate rows, together with @ID, into a child table.
And yes, I will add the necessary error handling in the script.
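For what it's worth, inside a proc like that the split function can feed the child-table insert directly. A minimal sketch (the child table and its columns are made-up names, just for illustration):
-- assumes something like: CREATE TABLE Tab1Child (ParentID INT, Val INT)
INSERT INTO Tab1Child (ParentID, Val)
SELECT @ID, CAST(s.x AS INT)
FROM dbo.fnListSplit(@List1, ',') s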
Thanks
Brij
May 30, 2006 at 4:59 am
I agree with you...
I normally use CSV just for integer values, so there is not much load and no need to look for special characters and all that. The CSV is passed by the developer to the stored proc, as it is easy to pick it up from a multi-select list or grid.
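For an integer list the split output just needs a CAST, e.g. (a small sketch using the function above):
SELECT CAST(x AS INT) AS n
FROM dbo.fnListSplit('12,12,14,14,15', ',')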
Brij
May 30, 2006 at 6:03 pm
declare @csv varchar(8000), @delimiter varchar(10)
set @csv = 'a|b|c'   -- sample value so the snippet runs as-is
set @delimiter = '|'
declare @sql varchar(8000)
set @sql = 'select ''' + replace(replace(@csv, char(39), char(39)+char(39)), @delimiter, ''' union all select ''') + ''''
exec (@sql)
I liked its simplicity, but you can't use dynamic SQL in a UDF, and there are space limitations when building the SQL string (though with SQL9's varchar(max) that won't be a problem!). In effect each delimiter has to be regarded as 20 characters long, and another 9 characters are added on top, while the whole statement must be kept below 8000 characters. You could add validation to calculate this figure.
A limit of 400 characters on the input string would cover all possibilities but is a bit small. A maximum of 200 20-char items would work, or 133 @ 40 chars, or 320 @ 5 chars... You can do fancy things with splitting strings as they get too big, but not without complicating things so much that the solution becomes far less appealing. There is also the issue of extra space for doubled (embedded) single quotes - which the code deals with but my space calculations so far haven't. Whether that matters depends on whether single quotes are legal characters in your input strings in the first place.
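To put that arithmetic into code, a length check before executing the dynamic statement could look roughly like this (my own sketch, assuming a single-character delimiter; the 20 and 9 come from the length of ' union all select ' plus the outer select and quotes):
declare @items int, @projected int
set @items = len(@csv) - len(replace(@csv, @delimiter, '')) + 1   -- delimiters + 1
set @projected = 9                                                -- leading select ' plus trailing '
               + len(@csv) - (@items - 1)                         -- the item text itself
               + (@items - 1) * 20                                -- each delimiter becomes ' union all select '
               + len(@csv) - len(replace(@csv, char(39), ''))     -- one extra char per embedded single quote
if @projected > 8000
    raiserror('Generated statement would exceed 8000 characters.', 16, 1)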
Tim Wilkinson
"If it doesn't work in practice, you're using the wrong theory"
- Immanuel Kant
May 30, 2006 at 6:41 pm
I have to add that all this talk of 1NF and not passing CSVs really only arises because of the shortcomings of T-SQL - not being able to pass tables (or something equivalent) as parameters means that perfectly valid arrays (or one-column tables) have to be forced into scalar datatypes in order to be passed cleanly between code units.
Tim Wilkinson
"If it doesn't work in practice, you're using the wrong theory"
- Immanuel Kant
May 31, 2006 at 4:44 am
Here are some more ideas for splitting... http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=50648 (including the one from Peter above)
Peter - I think I've successfully deduced that you are Peso on SQLTeam
Joe - I do enjoy your extreme purism (in this area and others), and I agree with you in many respects. Keep up the good work
And here are a few other related links...
http://vyaskn.tripod.com/passing_arrays_to_stored_procedures.htm
http://www.sommarskog.se/arrays-in-sql.html
http://www.sql-server-performance.com/mm_list_random_values.asp
Ryan Randall
Solutions are easy. Understanding the problem, now, that's the hard part.