August 2, 2006 at 2:17 pm
Is there anything new in SQL 2005 that can evaluate a variable used within an IN clause? I know it can be done with dynamic SQL, but that is not a viable option for what I am using it with.
For example, an application passes a comma-separated list of IDs, such as (1,4,7,10,14), to a stored procedure:
create procedure GetEmployeeList
@empList nvarchar(200)
AS
select employees.name
from employees
where employeeID IN (@empList)
Previously you could not do this unless the sproc built the query as a string and executed it within itself using EXEC.
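For the record, the EXEC approach I'm trying to avoid looks something like this (just a sketch against the same employees table as above):
create procedure GetEmployeeList_Dynamic
    @empList nvarchar(200)
as
declare @sql nvarchar(500)
-- build the whole statement as a string so the list ends up inline in the IN clause
set @sql = N'select name from employees where employeeID in (' + @empList + N')'
exec (@sql)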
Is there a way to do this that I am missing?
August 2, 2006 at 3:49 pm
This is rough, but it is adapted from a technique I've been working on for an upcoming article. A common table expression is an excellent way to parse short strings as arrays.
You get to be one of my guinea pigs. 😉
-------------------------------------------------------------------------
create procedure GetEmployeeList
@empList nvarchar(200)
AS
with emp_cte (emp_id, emp_left, depth) as (
select @empList, @empList, 0
union all
select left(emp_left, charindex(',', emp_left) - 1 ),
right(emp_left, len(emp_left) - charindex(',', emp_left) ),
depth + 1
from emp_cte
where emp_left like '%,%'
union all
select emp_left, null, depth + 1
from emp_cte
where emp_left not like '%,%'
)
select e.name
from employees e
join emp_cte c
on e.employeeID = c.emp_id
August 3, 2006 at 7:41 am
I tried using CTEs initially but ran into problems when creating the anchor member. Because the list comes in as a varchar and the member definition expects ints for the column, it errors out on the conversion.
I ran your example and got the following error, which was my same result when I tried it:
declare @empList varchar(200)
set @empList = '1,3,6,8,9,12,14';
with emp_cte (emp_id, emp_left, depth) as (
select @empList, @empList, 0
union all
select left(emp_left, charindex(',', emp_left) - 1 ),
right(emp_left, len(emp_left) - charindex(',', emp_left) ),
depth + 1
from emp_cte
where emp_left like '%,%'
union all
select emp_left, null, depth + 1
from emp_cte
where emp_left not like '%,%'
)
select e.firstname
from employees e
join emp_cte c
on e.employeeID = c.emp_id
Result:
Msg 245, Level 16, State 1, Line 4
Conversion failed when converting the varchar value '1,3,6,8,9,12,14' to data type int.
August 3, 2006 at 12:10 pm
Change the end of the query to this:
on e.employeeID = convert(int, c.emp_id)
where depth > 0
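Putting that together with the test script above, the whole thing should look something like this (same made-up employees table; the depth > 0 filter keeps the unsplit anchor row away from the convert):
declare @empList varchar(200)
set @empList = '1,3,6,8,9,12,14';
with emp_cte (emp_id, emp_left, depth) as (
select @empList, @empList, 0
union all
select left(emp_left, charindex(',', emp_left) - 1),
right(emp_left, len(emp_left) - charindex(',', emp_left)),
depth + 1
from emp_cte
where emp_left like '%,%'
union all
select emp_left, null, depth + 1
from emp_cte
where emp_left not like '%,%'
)
select e.firstname
from employees e
join emp_cte c
on e.employeeID = convert(int, c.emp_id)
where c.depth > 0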
August 3, 2006 at 5:31 pm
Maybe XML is what you need?
We did this in 2000 with:
1. sp_xml_preparedocument
2. OPENXML
3. JOIN with the resultset of OPENXML (or IN (SELECT a_field FROM OPENXML(....)))
I expect that in Yukon it is even better and easier.
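For anyone who hasn't used it, the 2000-style pattern looks roughly like this (a sketch; the element and attribute names are made up):
declare @hDoc int
declare @xml nvarchar(500)
set @xml = N'<ids><id val="1"/><id val="4"/><id val="7"/><id val="10"/><id val="14"/></ids>'
-- parse the XML into an internal document handle
exec sp_xml_preparedocument @hDoc output, @xml
select e.name
from employees e
where e.employeeID in (
    select val
    from openxml(@hDoc, '/ids/id', 1) with (val int)
)
-- free the document handle when done
exec sp_xml_removedocument @hDoc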
August 3, 2006 at 7:17 pm
Aaron's fix works pretty well. I'm going to do some performance testing with CTEs versus the subquery I was using to see how well they match up. I'll post the results within the next week or so... stay tuned.
August 4, 2006 at 10:00 am
I've solved this by creating a function (named fnTabulate) that returns a table of the supplied integers. I use it more than I anticipated...
----------------------------------------------------------------------------
declare @StringIDs varchar(50)
set @StringIDs = '1,2,3,4,5,6,7,8,9'
select t.ID
from Table1 t,
dbo.fnTabulate(@StringIDs) i
where t.ID = i.Data
----------------------------------------------------------------------------
CREATE function fnTabulate ( @String varchar(8000) )
returns @x table ( Data varchar(256) )
as begin
    declare @Comma int,
            @cData varchar(8000),
            @StrLen int

    -- trim the input and make sure it ends with a trailing comma
    select @String = ltrim(rtrim(@String)),
           @String = case when @String = ''
                          then ''
                          else @String +
                               case when right(@String, 1) != ','
                                    then ','
                                    else ''
                               end
                     end,
           @Comma = charindex( ',', @String, 0),
           @StrLen = len(@String)

    -- peel off one value per pass until no commas remain
    while @Comma > 0 begin
        select @cData = left(@String, @Comma - 1)
        if @@Error != 0 goto OnError
        select @String = right(@String, @StrLen - @Comma),
               @Comma = charindex( ',', @String, 0),
               @StrLen = len(@String)
        if @@Error != 0 goto OnError
        insert @x ( Data ) select @cData
        if @@Error != 0 goto OnError
    end
    goto NoError

    OnError:
        -- something went wrong; return an empty table
        delete @x
        goto TheEnd
    NoError:
        goto TheEnd
    TheEnd:
    return
end
August 4, 2006 at 11:58 am
That's one of the reasons the article is on hold for now... I did some performance comparisons and found that in many cases it actually performs better to use a WHILE loop than a CTE to process these lists.
And I've gotten to the point where I have a knee-jerk reaction against using WHILE or CURSOR because of the performance implications...
August 4, 2006 at 12:12 pm
That's what I was afraid of... ugh. It's a shame... some older DBs like Progress allow array datatypes, so you can pass in an array of ints to be evaluated in a query, and it was a real performance boost when querying groups of data. Trying to pass that into SQL is a performance hit...
I was hoping 2005 would resolve this, but I guess my suggestion didn't make it into the feature list.
August 5, 2006 at 3:48 am
Hahahah...
the reason I developed the technique for parsing string data like that is that I have an ODBC connection to a Progress database. Progress dumps the array data down as one huge nvarchar column, and I need a way to process about 50,000 rows very quickly (with about 50 variable-length items in the array).
I'm sure you love it, but man... Progress has been a huge thorn in my side for the past few months.
August 7, 2006 at 5:32 am
I still think that passing an XML parameter is the best approach.
You have the ability to pass not just an array but a whole TABLE, and even resultsets from several tables.
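In 2005 you can also type the parameter as xml and shred it with the nodes() method, with no sp_xml_preparedocument needed. A rough sketch (the element and attribute names are just placeholders):
create procedure GetEmployeeListXml
    @empList xml
as
select e.name
from employees e
where e.employeeID in (
    select t.n.value('@val', 'int')
    from @empList.nodes('/ids/id') as t(n)
)
-- example call:
-- exec GetEmployeeListXml @empList = N'<ids><id val="1"/><id val="4"/><id val="7"/></ids>'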