February 8, 2006 at 4:56 pm
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/sholliday/thezerotonparameterproblem.asp
February 24, 2006 at 2:48 pm
Just to follow up: if you do a Google search for
"How to Optimize the Use of the OR Clause When Used with Parameters"
you'll find an article that talks about the OR issue, which is related to the Zero to N solution I propose.
Again, it is not a "cure all"; the trade-off of less complexity versus lost performance has to be weighed.
February 28, 2006 at 4:05 am
Looks cool, definitely useful in many report scenarios.
I just have one question...
Is this approach SQL injection safe?
I did not see any issues with your specific query at first sight, but I get a little worried when I see untyped parameters being passed or dynamic queries being generated. I did not have enough time to make a thorough analysis here; I assume you can answer that, so it is easier for me to ask than to spend the time.
If not, a little warning might be good to accompany the article.
Thank you.
Duray AKAR
February 28, 2006 at 5:19 am
I see this as being a LOT more work to implement, test and debug than simply accepting a number of parameters that have default values. The XML allows you to specify a number of OrderIDs; equally, you could accept a varchar(8000) parameter as a CSV list of order IDs and use a simple UDF to split them into a table against which you can join (a sketch of such a UDF is below). The default value for all parameters can be NULL; if a parameter is NULL it is ignored (by appropriate code in the WHERE clause). Apart from that, I have tended to follow the same approach for many of the filtering screens in our application.
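For reference, here is a minimal sketch of the kind of split UDF I mean; the function and column names are my own illustration, not from the article:

CREATE FUNCTION dbo.fnSplitCsv (@List varchar(8000))
RETURNS @Items TABLE (Item varchar(100))
AS
BEGIN
    DECLARE @Pos int
    SET @List = LTRIM(RTRIM(@List)) + ','     -- trailing comma simplifies the loop
    SET @Pos = CHARINDEX(',', @List)
    WHILE @Pos > 0
    BEGIN
        IF @Pos > 1                           -- skip empty fragments
            INSERT INTO @Items (Item)
            VALUES (LTRIM(RTRIM(LEFT(@List, @Pos - 1))))
        SET @List = SUBSTRING(@List, @Pos + 1, 8000)
        SET @Pos = CHARINDEX(',', @List)
    END
    RETURN
END

Then, inside the proc, join against it to filter on any number of IDs:

SELECT o.OrderID, o.OrderDate
FROM Orders o
JOIN dbo.fnSplitCsv(@OrderIDList) s ON o.OrderID = CAST(s.Item AS int)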
An optimisation can be made, if you know that certain parameters are likely to appear, by coding different branches in the stored proc so that the WHERE clauses can make use of indexes (no OR statements on the parameters that are always present in a particular branch); see the sketch below.
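A hypothetical sketch of that branching (table, column and proc names invented for illustration):

CREATE PROC uspOrdersGetFiltered
    @CustomerID int = NULL,
    @OrderDate datetime = NULL
AS
IF @CustomerID IS NOT NULL
    -- sargable predicate: an index on CustomerID can be used here
    SELECT OrderID, CustomerID, OrderDate
    FROM Orders
    WHERE CustomerID = @CustomerID
      AND OrderDate = ISNULL(@OrderDate, OrderDate)
ELSE IF @OrderDate IS NOT NULL
    -- sargable predicate: an index on OrderDate can be used here
    SELECT OrderID, CustomerID, OrderDate
    FROM Orders
    WHERE OrderDate = @OrderDate
ELSE
    SELECT OrderID, CustomerID, OrderDate
    FROM Orders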
I haven't done any performance comparison of CSV string splitting VS XML, but I imagine the overhead of invoking the XML processor, as well as the annoyance of building the XML client-side (or even worse, testing in Query Analyser) means I'll be sticking to CSV lists (where needed) and default values of NULL for parameters unless convinced otherwise...
I will be happily proven wrong - so long as it's done nicely
February 28, 2006 at 6:31 am
I don't see this technique as an improvement where people are already using the CSV string technique. That said, by the time you have serialized an arbitrarily large number of unrelated parameters into strings, XML at least brings a standard format to the table (after all, XML is essentially a string solution too).
Since table variables cannot be passed between stored procedures, an XML document is a good way to serialize a resultset to use as input to another procedure (a sketch of the OPENXML side of this is below). When simpler strategies fail, this technique may be one more option. It is a good one for the SQL bag of tricks.
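For illustration, a minimal sketch of shredding such an XML parameter with OPENXML (procedure and element names are my own, not from the article):

CREATE PROC uspOrdersProcessXml @XmlDoc varchar(8000)
AS
DECLARE @hDoc int
-- parse the XML text into an internal document handle
EXEC sp_xml_preparedocument @hDoc OUTPUT, @XmlDoc

-- shred the handle into a rowset (flag 2 = element-centric mapping)
SELECT OrderID
FROM OPENXML(@hDoc, '/Orders/Order', 2)
     WITH (OrderID int 'OrderID')

-- always release the handle, or the parsed document stays in server memory
EXEC sp_xml_removedocument @hDoc
GO

-- Example call:
EXEC uspOrdersProcessXml '<Orders><Order><OrderID>10248</OrderID></Order><Order><OrderID>10249</OrderID></Order></Orders>'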
February 28, 2006 at 7:13 am
After years of trying, I prefer to perform a loop in my application to pass the data. It is safer, syntax issues are fewer, and it performs at about the same speed. The big gain is that I can decide how to handle each and every error independently, without all the extra overhead. Programming languages such as C++, C#, and VB all have better methods for separating a data set than you can get inside SQL code. And if you are concerned with treating it as a batch transaction versus independent transactions, just execute "BEGIN TRANSACTION" over the connection first and end with COMMIT or ROLLBACK as you need (see the sketch below). You should also keep your code simple: with this XML method it can be a lot of work to troubleshoot even the simplest of issues, and you do open yourself to injection attacks.
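To put the transaction point in T-SQL terms (the proc name is hypothetical), the commands the application sends over the one connection amount to:

BEGIN TRANSACTION
-- the loop in the application issues one simple, strongly typed call per row
EXEC uspOrderInsert @OrderID = 10248, @OrderDate = '2006-01-01'
EXEC uspOrderInsert @OrderID = 10249, @OrderDate = '2006-01-02'
-- ... one EXEC per item in the client-side loop ...
COMMIT TRANSACTION   -- or ROLLBACK TRANSACTION if any call failed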
March 1, 2006 at 12:44 pm
Just to make something clear:
if the type is always the same, SQL Server 2000 can handle 2100 parameters.
And to be honest, if you need more than that there are probably bigger issues with your design 😉
If it is all about list processing, then CSVs are way nicer (in my opinion).
The only case in which I have used something like this is when multiple (master/detail) records needed to be transferred in one go; a sketch of that case is below. But for reporting I think this is not the case.
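A minimal sketch of that master/detail case (names invented for illustration), pulling both levels out of one document:

DECLARE @hDoc int
EXEC sp_xml_preparedocument @hDoc OUTPUT,
    '<Order OrderID="1" CustomerID="7">
        <Line ProductID="11" Qty="2"/>
        <Line ProductID="42" Qty="1"/>
     </Order>'

-- detail rows, reaching back to the parent for the key (flag 1 = attribute-centric)
SELECT OrderID, ProductID, Qty
FROM OPENXML(@hDoc, '/Order/Line', 1)
     WITH (OrderID   int '../@OrderID',
           ProductID int '@ProductID',
           Qty       int '@Qty')

EXEC sp_xml_removedocument @hDoc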
Just my $0.02
* Noel
February 28, 2007 at 10:52 am
That is definitely a cool way to skin the cat. I have been fascinated by XML since the early days and still am.
There have been plenty of times I have wanted to use a single proc to pull back the same columns but with different criteria. The following is a method I came up with a couple of years ago, and it has been quite handy. I actually almost had it published in Sql Svr mag, but the editor had concerns over performance (so they decided not to use it), so use it with a grain of salt. In my personal experience, performance has not been an issue, and the gains from not using a bunch of "IF" statements, dynamic SQL, or maintaining multiple procs that return the same results have been a winner in my book.
CREATE PROC ProcName
    @Param1 varchar(50) = NULL,
    @Param2 int = NULL
AS
SELECT field1, field2
FROM tablename
WHERE field1 = ISNULL(@Param1, field1)  -- NULL param: field compared to itself,
  AND field2 = ISNULL(@Param2, field2)  -- so that filter is effectively ignored
-- Caveat: rows where field1 or field2 IS NULL drop out even when the parameter
-- is NULL, because NULL = NULL does not evaluate to true.
Summary: If you pass the parameter, it's used for the comparison. If not, the field is compared against itself. Very simple, very easy to use.
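Usage is then just a matter of passing only the filters you need (values here are made up):

EXEC ProcName                                 -- all rows: both filters collapse to field = field
EXEC ProcName @Param2 = 42                    -- filter on field2 only
EXEC ProcName @Param1 = 'abc', @Param2 = 42   -- filter on both fields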
February 28, 2007 at 10:21 pm
It appears to me to be a bit too much work for SQL Server. OPENXML has a large overhead associated with it. Besides, the code appears fairly difficult to maintain.
March 20, 2007 at 12:03 pm
Does this work with multiple filters? For example Country = "USA" & Order Date After "1/1/2006". Got an example?
March 20, 2007 at 2:16 pm
BigSam,
Samples are provided at the bottom of the article.
Here is your specific one.
print 'Filter on specific Countries and OrderDate'
EXEC uspOrderDetailsGetByXmlParams '
<ParametersDS>
<CustomerCountry>
<CountryName>USA</CountryName>
</CustomerCountry>
<SingleValueParam>
<OrderDateAfter>1/1/2006</OrderDateAfter>
</SingleValueParam>
</ParametersDS>
'