August 9, 2010 at 9:50 pm
Comments posted to this topic are about the item Parsing Parameters in a Stored Procedure
August 9, 2010 at 11:57 pm
Nice coverage. Thank you for taking the time to spell out the steps and issues when working with charindex in this manner. We do something similar for handling delimited values stored directly in the db field. And yes, when pulling from a known lookup list, it's nice and simple to append your delimiter on both ends. Love these brute force, simplistic solutions.
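The "delimiter on both ends" trick mentioned above can be sketched as follows; the lookup table and column names here are made up for illustration:

```sql
-- Wrap both the list and the candidate value in the delimiter so every
-- element, including the first and last, matches only on exact boundaries.
DECLARE @List varchar(100)
SET @List = 'NY,CA,TX'

SELECT st.StateCode                  -- dbo.States is a hypothetical lookup table
  FROM dbo.States st
 WHERE CHARINDEX(',' + st.StateCode + ',', ',' + @List + ',') > 0
-- ',NY,' is found in ',NY,CA,TX,', so 'NY' matches;
-- a partial value like 'N' does not, because ',N,' never occurs.
```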
August 10, 2010 at 12:22 am
This comes in very handy with Reporting Services (SSRS) when dealing with its multi-valued parameters.
I have used a routine like this heavily for this very purpose for some time now. I have produced a function tailored for use with Reporting Services parameters, but it is relevant to any scenario where you are dealing with an array of IDs passed in one parameter or field.
I can't take credit for the guts of the procedure, as much of it was taken from articles like this one. But I include it here because it takes this article's point further and provides a relatively elegant solution to a fairly ugly problem:
Create Function General.ufn_SplitIntegers
(@String varchar(Max))
Returns @Results Table (Value Integer Primary Key)
As
-- ------------------------------------------------------------------------------------------------
-- This function takes a string containing an array and splits the array out and returns it as a
-- table object. This is particularly useful for handling multi value parameters in Reporting
-- Services.
--
-- There are variants of this routine. This one is tuned to assume a delimiter of comma and that
-- all values passed are integers. Others allow for user defined delimiter and dealing with text values.
--
-- To use it code as follows;
--SELECT Value FROM General.ufn_SplitIntegers('3,23,45,2,6')
-- or
--SELECT Value FROM General.ufn_SplitIntegers(@Parameter)
-- ------------------------------------------------------------------------------------------------
Begin
    Declare @index Int, @slice varchar(Max), @Delimiter char(1)
    Select @index = 1, @Delimiter = ','
    If @String Is Null Return
    While @index != 0
    Begin
        -- Get the index of the first occurrence of the split character
        Select @index = CharIndex(@Delimiter, @String)
        -- Now push everything to the left of it into the slice variable
        If @index != 0
            Select @slice = Left(@String, @index - 1)
        Else
            Select @slice = @String
        -- Put the item into the results set
        Insert Into @Results(Value) Values(@slice)
        -- Chop the removed item off the main string
        Select @String = Right(@String, Len(@String) - @index)
        -- Break out if we are done
        If Len(@String) = 0 Break
    End
    Return
End
GO
August 10, 2010 at 12:28 am
Nice solution. I was actually looking for a better way to parse multi-valued parameter values from Reporting Services over to a procedure, and until now I only had a loop function. This looks like a nicer and probably faster way to parse the values.
Tx
August 10, 2010 at 12:44 am
The XML Version
declare @Param as xml
set @Param = '<I>1</I><I>2</I><I>3</I>'
select IntList.IntField.value('.', 'int')
from @Param.nodes('I') IntList(IntField)
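A sketch of how the snippet above can be put to work filtering a table; dbo.Orders and OrderId are invented for the example:

```sql
-- Use the shredded XML list as the right-hand side of an IN predicate.
declare @Param as xml
set @Param = '<I>1</I><I>2</I><I>3</I>'

select o.*
  from dbo.Orders o                   -- hypothetical table
 where o.OrderId in (select IntList.IntField.value('.', 'int')
                       from @Param.nodes('I') IntList(IntField))
```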
August 10, 2010 at 2:43 am
Very simple and useful.
I was looking for something like this only, and it helps me a lot.
Thanks.:-)
August 10, 2010 at 2:56 am
The use of table-valued parameters in SQL Server 2008 removes the need for these comma-separated parameters.
But if you still need something like this, you could use a CLR function, or something like the following:
--select * from [System].[tfn_DelimStringToTable]('a,b,c,d,e',',')
CREATE FUNCTION [System].[tfn_DelimStringToTable](@str NVARCHAR(MAX), @delim CHAR(1))
RETURNS @table TABLE ([ID] INT IDENTITY(1,1), [Column] sysname NOT NULL)
WITH SCHEMABINDING, ENCRYPTION
AS
BEGIN
    DECLARE @x XML
    SET @x = '<t>' + REPLACE(@str, @delim, '</t><t>') + '</t>'
    INSERT INTO @table([Column])
    SELECT x.i.value('.', 'NVARCHAR(MAX)') AS token
    FROM @x.nodes('//t') x(i)
    RETURN
END
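For comparison, the SQL Server 2008 table-valued parameter approach mentioned at the top of this post could look roughly like this; the type, procedure, and table names are invented for the sketch:

```sql
-- A table type takes the place of the comma-separated string.
CREATE TYPE dbo.IntList AS TABLE (Value int PRIMARY KEY)
GO
CREATE PROCEDURE dbo.usp_GetItems
    @Ids dbo.IntList READONLY        -- TVPs must be declared READONLY
AS
    SELECT i.*
      FROM dbo.Items i               -- hypothetical target table
      JOIN @Ids ids ON ids.Value = i.ItemId
GO
-- The caller fills a variable of the type and passes it directly,
-- so no string parsing is needed on the server side:
DECLARE @Ids dbo.IntList
INSERT INTO @Ids (Value) VALUES (3), (23), (45)
EXEC dbo.usp_GetItems @Ids = @Ids
```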
August 10, 2010 at 3:27 am
Nice article, and I think a better approach.
I have solved the same problem in a different way. Please have a look:
http://www.sqlservercentral.com/articles/Comma+seperated/67417/
Thanks,
Nizam
August 10, 2010 at 5:42 am
I wonder how this performs if the table you are searching has 20K+ rows. It seems that this might be faster only for really small tables, such as states. Has anyone done any benchmarks?
August 10, 2010 at 5:59 am
I have been using an XML string to achieve the same results. Not sure that it is faster, but it is quite flexible as:
EXEC sp_xml_preparedocument @idoc OUTPUT, @XMLParameters
insert into #T
select *
FROM OPENXML (@idoc, '/OrderGroup/Item', 1)
WITH #T
EXEC sp_xml_removedocument @idoc
In this example #T is a large table with over 20 columns. @XMLParameters is varchar(max) to pass as many rows as necessary at once, which is an added bonus, because there is still no way to pass a table as a parameter. (I am using varchar instead of XML because XML is stricter and we do not need that here.)
Comments on this approach are welcome.
Jerome
August 10, 2010 at 6:30 am
Nice article providing help for parsing delimited values, but I prefer the XML version posted in the comments, which is simpler, though maybe more resource-consuming.
But my question is: why does this article have the title "parsing stored procedure parameters" if you are just looking for a specific switch value in a single parameter? You are not looking for Name=Value pairs, just for values. Maybe all this work is not really required if what you do next with the parsed values is, for example, a CASE statement with fixed values, and you have to do the word search again later in the code.
Could you provide an example of how the parameter values received from your parse routine are used?
August 10, 2010 at 6:33 am
jcboyer-1091017 (8/10/2010)
I have been using an XML string to achieve the same results. Not sure that it is faster, but it is quite flexible as:
EXEC sp_xml_preparedocument @idoc OUTPUT, @XMLParameters
insert into #T
select *
FROM OPENXML (@idoc, '/OrderGroup/Item', 1)
WITH #T
EXEC sp_xml_removedocument @idoc
In this example #T is a large table with over 20 columns. @XMLParameters is varchar(max) to pass as many rows as necessary at once, which is an added bonus, because there is still no way to pass a table as a parameter. (I am using varchar instead of XML because XML is stricter and we do not need that here.)
Comments on this approach are welcome.
Jerome
Hi Jerome,
Can you provide an example of what the content of @XMLParameters would be for your code? I don't want to presume. Thanks.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 10, 2010 at 6:53 am
Didn't we have a MASSIVE thread here on SSC.com on string parsing, with amazing code samples and wonderful benchmarks??
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
August 10, 2010 at 6:53 am
I'm torn. The article is well written and the code examples are pertinent to the article. However...
It was a quick read I did through the article, so I may have missed something, but I didn't see anywhere that non-validated "elements" in the input string were handled. I saw where an "alert" was created if there were no matches, but I didn't see where individual elements would produce an alert if invalid ones existed.
For all of you good folks using XML to split things... I suppose that XML either does something special for you or that you consider its performance to be good enough. Just be aware that there are faster methods than XML... sometimes a lot faster.
Last but not least, this reminds me of the old "LIKE" method, which also failed to notify if a single element of a parameter could not be validated, the only difference being that the validation list is table-driven instead of built into the code.
I'll try to post some test code tonight to show all of what I believe can happen with the code from the article. I might even have time to show why I don't use XML splitter code.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 10, 2010 at 7:00 am
There is a cute solution which I borrowed from some articles about Tally tables on this website.
Check this out: http://www.sqlservercentral.com/articles/T-SQL/63003/
Here is how I implemented it:
1. Create a Tally table. This is a simple table with one column, N, which contains (for instance) numbers from 1 to 11000.
--Create a Tally Table
--===== Create and populate the Tally table on the fly
SELECT TOP 11000 --equates to more than 30 years of dates
       IDENTITY(INT,1,1) AS N
  INTO dbo.Tally
  FROM Master.dbo.SysColumns sc1,
       Master.dbo.SysColumns sc2

--===== Add a Primary Key to maximize performance
ALTER TABLE dbo.Tally
  ADD CONSTRAINT PK_Tally_N PRIMARY KEY CLUSTERED (N) WITH FILLFACTOR = 100

--===== Allow the general public to use it
GRANT SELECT ON dbo.Tally TO PUBLIC
2. Create a parsing function:
create function [dbo].[FUNC_TBL_TEXT_IN_COL]
(
     @Numbers   varchar(max)
    ,@Delimiter char
) returns table
as
return
SELECT SUBSTRING(@Delimiter + @Numbers + @Delimiter, N + 1,
       CHARINDEX(@Delimiter, @Delimiter + @Numbers + @Delimiter, N + 1) - N - 1) as X
  FROM dbo.Tally
 WHERE N < LEN(@Delimiter + @Numbers + @Delimiter)
   AND SUBSTRING(@Delimiter + @Numbers + @Delimiter, N, 1) = @Delimiter
3. Run an example:
select X
  from [dbo].[FUNC_TBL_TEXT_IN_COL]
       (
        'Jones,Schneider,68,whatever', ','
       )
Result is:
X
----
Jones
Schneider
68
whatever