February 5, 2014 at 9:33 pm
Comments posted to this topic are about the item Sending a Variable Number of Elements in a Parameter
February 5, 2014 at 10:45 pm
Alex,
Unfortunately, I must offer two criticisms of this article. At the end you said:
Alex Grinberg (1/15/2014)
My preference is the User-Defined Table type. I found that this technique is better optimized than the other methods discussed in this article.
Normally, when you make a statement like that it is customary to provide some benchmarking information regarding specifically what you mean. Fortunately, in this case SQL MVP Erland Sommarskog has got your back in his definitive treatises on this concept:
http://www.sommarskog.se/arrays-in-sql.html
The second criticism is about the way that you've chosen to split your delimited list. Pretty well known around here is DelimitedSplit8K, championed by SQL MVP Jeff Moden in this article:
Tally OH! An Improved SQL 8K “CSV Splitter” Function
It is going to be orders of magnitude faster than the string splitter you've included.
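For readers who haven't seen the set-based approach, here is a hypothetical, much-simplified sketch of the idea behind splitters like DelimitedSplit8K: generate a numbers ("tally") list inline and use it to find delimiter positions, instead of looping. The function name and details are illustrative only, not the actual DelimitedSplit8K code:

```sql
-- Simplified illustrative sketch of a tally-based splitter (NOT the real
-- DelimitedSplit8K, which is considerably more refined).
CREATE FUNCTION dbo.SplitSimple (@pString varchar(8000), @pDelimiter char(1))
RETURNS TABLE
AS
RETURN
WITH E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) v(N)),
     E4(N) AS (SELECT 1 FROM E1 a, E1 b, E1 c, E1 d),   -- up to 10,000 rows
     Tally(N) AS (SELECT TOP (ISNULL(DATALENGTH(@pString), 0))
                         ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
                  FROM E4),
     -- Start position of each element: position 1, plus every position
     -- immediately after a delimiter.
     Starts(N) AS (SELECT 1
                   UNION ALL
                   SELECT N + 1 FROM Tally
                   WHERE SUBSTRING(@pString, N, 1) = @pDelimiter)
SELECT ItemNumber = ROW_NUMBER() OVER (ORDER BY s.N),
       Item = SUBSTRING(@pString, s.N,
                        ISNULL(NULLIF(CHARINDEX(@pDelimiter, @pString, s.N), 0) - s.N, 8000))
FROM Starts s;
```

A call such as SELECT Item FROM dbo.SplitSimple('a,b,c', ',') would return one row per element, with no WHILE loop involved.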
My thought question: Have you ever been told that your query runs too fast?
My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.
Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
February 5, 2014 at 11:04 pm
Dwain, you hit on both things I was thinking myself.
To be fair, there are two splitters posted. The first is the looping version. The second, inside usp_DelimitedString, looks to be a complete "borrowing" of the DelimitedSplit8K function. I'm not sure why the delimiter isn't passed in instead of being hardcoded in the proc.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
February 5, 2014 at 11:20 pm
Sean Lange (2/5/2014)
Dwain you hit both things I was thinking myself. To be fair there are two splitters posted. The first is the looping version. The second one is inside the usp_DelimitedString looks to be a complete "borrowing" of the DelimitedSplit8K function. Not sure exactly why the delimiter isn't passed in instead of being hardcoded in the proc.
Missed that. So now I'll have to add two more:
- There are probably sufficient reasons to say this "borrowing" was not direct plagiarism (e.g., it operates on a much shorter string and the delimiter is hard-coded) but an attribution to Jeff would probably have been good nonetheless.
- Why would you embed utility code in the SP when an external function as fast as DelimitedSplit8K is a much more effective approach, particularly from a reusability standpoint?
My apologies for being a tough audience today.
February 6, 2014 at 12:30 am
Another option, or a variation on the solutions offered, would be to parse the string using CLR.
We have gone down this route before and, without digging up the benchmarking (sorry), we found it quicker on CPU.
Further, if the list of variables is fixed, such as int (i.e. 4 bytes), the data can be sent using the varbinary(max) datatype and split (again using CLR).
Just some ideas... 🙂
February 6, 2014 at 1:13 am
Damned if you do, damned if you don't!
I find the criticism somewhat harsh. Am I alone in thinking that articles don't have to be exhaustive? I can see both sides of the argument; however, I often like to read articles that don't have 10 pages of benchmarking. I'd rather just hear the idea sometimes. I'm intelligent enough to know that it mightn't be the only idea, the best idea, or even sometimes a good idea.
Also, sorry but I find the allusion to plagiarism a little facile. This code is pretty much public domain by now, and while a heads up to Mr Moden would have been nice, I really don't feel that this is in the same ballpark as proper examples of the P word.
As someone that's followed SSC for a number of years, I think there's a danger of the articles becoming a private members club, where the uninitiated are put off from submitting their ideas. I know I haven't written for quite a while, in part for these reasons.
Go easy on me! And have a good day.
February 6, 2014 at 2:42 am
Many thanks to Alex for writing this.
I think the only confusion for me as a reader of this article was that the title didn't make clear enough whether it was about passing 'tables' as parameters to routines, or about variable numbers of parameters. The 'delimited string' option seemed to demonstrate the latter, and the other examples the former.
Best wishes,
Phil Factor
February 6, 2014 at 2:52 am
David McKinney (2/6/2014)
Damned if you do, damned if you don't! I find the criticism somewhat harsh. Am I alone in thinking that articles don't have to be exhaustive? I can see both sides of the argument, however I often do like to read articles that don't have 10 pages of benchmarking etc.. I'd rather just hear the idea sometimes. I'm intelligent enough to know that it mightn't be the only idea, the best idea, or even sometimes a good idea.
Also, sorry but I find the allusion to plagiarism a little facile. This code is pretty much public domain by now, and while a heads up to Mr Moden would have been nice, I really don't feel that this is in the same ballpark as proper examples of the P word.
As someone that's followed SSC for a number of years, I think there's a danger of the articles becoming a private members club, where the uninitiated are put off from submitting their ideas. I know I haven't written for quite a while, in part for these reasons.
Go easy on me! And have a good day.
I share this sentiment and will read the article in full at a later moment.
I will compare the solutions to what I have in the works myself, which is based on CLR with some twists.
February 6, 2014 at 3:27 am
David McKinney (2/6/2014)
Damned if you do, damned if you don't! I find the criticism somewhat harsh. ...
Also, sorry but I find the allusion to plagiarism a little facile. This code is pretty much public domain by now, and while a heads up to Mr Moden would have been nice, I really don't feel that this is in the same ballpark as proper examples of the P word.
...
Well I did at least apologize for my harshness. Maybe I'm just having a bad day.
dwain.c (2/5/2014)
...My apologies for being a tough audience today.
I did say that I didn't consider it plagiarism, although I understand how simply mentioning that harsh word may be interpreted as thinking it was.
I think the thing that fried my liver the most was that looping string splitter. Seeing that published again and again, when there are so many faster methods available, just rubs me the wrong way.
February 6, 2014 at 4:20 am
Interesting, very interesting... In addition to the comments above, I'd also take issue with your XML implementation.
<Product>
<ProductID>1</ProductID>
<ProductName>Chai</ProductName>
<CompanyName>Lehmanns Marktstand</CompanyName>
<ShippersName>Speedy Express</ShippersName>
<OrderDate>12/07/2013</OrderDate>
<UnitPrice>18.0000</UnitPrice>
<Quantity>40</Quantity>
<Discount>0.15</Discount>
</Product>
XML also supports attributes. If you use attributes, then your XML will be half the string size,
e.g. <Product ProductID="1" ProductName="Chai" />
Also, you should have a user-defined function to escape the four XML escape sequences,
and you should alias the attribute names,
e.g. <Prod ID="1" PN="Chai" /> (shorter again)....
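For reference, attribute-centric XML like the example above is read on the SQL Server side with the @attribute syntax of the value() method. This is a hedged sketch; the document and element names are made up for illustration:

```sql
-- Reading attribute-centric XML with the xml type's nodes()/value() methods.
DECLARE @x xml = N'<Products><Product ProductID="1" ProductName="Chai" /></Products>';

SELECT p.n.value('@ProductID',   'int')          AS ProductID,
       p.n.value('@ProductName', 'nvarchar(40)') AS ProductName
FROM @x.nodes('/Products/Product') AS p(n);
```

The shredding works the same way as the element-centric version, only the path inside value() changes from an element path to an @attribute reference.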
February 6, 2014 at 5:14 am
My preference is also for UDTTs. Unfortunately, I think Microsoft faked the implementation by creating a temp table in tempdb and passing that to the stored procedure, resulting in a less-than-optimal solution. A good rule of thumb: if you are doing a round trip for more than three rows, you will benefit from using a UDTT. This has been a performance problem for my projects for five years, since clients insert thousands of rows in one shot. I was using XML, since some clients were on SQL Server 2005.
In my opinion, Microsoft is getting their butt kicked by NoSQL and graph databases. Microsoft always seems to be 10 years behind the curve (implementing OOP and a proper transactional ADO).
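For readers following along, the UDTT pattern being discussed looks roughly like this. All object names here (IntList, GetOrdersByIds, Orders) are illustrative assumptions, not from the article:

```sql
-- A user-defined table type and a procedure that accepts it as a parameter.
CREATE TYPE dbo.IntList AS TABLE (Value int NOT NULL PRIMARY KEY);
GO
CREATE PROCEDURE dbo.GetOrdersByIds
    @Ids dbo.IntList READONLY      -- table-valued parameters must be READONLY
AS
BEGIN
    SELECT o.*
    FROM dbo.Orders AS o
    JOIN @Ids AS i ON i.Value = o.OrderID;
END
GO
-- T-SQL caller side: fill a variable of the type and pass it in.
DECLARE @list dbo.IntList;
INSERT @list (Value) VALUES (1), (2), (3);
EXEC dbo.GetOrdersByIds @Ids = @list;
```

From client code (ADO.NET and similar), the same parameter is passed as a structured parameter rather than a T-SQL table variable.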
Thanks to the article author for braving this group. There is a clique in these forums that just can't seem to let it go when talking about string parsing. In my opinion they have ruined a good forum with all their petty griping and claiming there is only one right way to do things. I feel sorry for any new members who get bashed by these guys. Most newbies just want to learn the next step in their path. Teaching calculus to a second grader is absurd.
February 6, 2014 at 5:18 am
Hey, you can simply call the table-valued function below and you are done. Give it a try 🙂
ALTER FUNCTION [dbo].[ConvertStringToTable]
(
    @ArrayString nvarchar(1000),
    @Separator char(1) = ','
)
RETURNS @ResultedTable TABLE (Val nvarchar(100))
AS
BEGIN
    INSERT @ResultedTable
    SELECT T2.item.value('(./text())[1]', 'nvarchar(100)') AS Val
    FROM
    (
        SELECT CONVERT(xml, '<Table><Rec>'
                            + REPLACE(@ArrayString, @Separator, '</Rec><Rec>')
                            + '</Rec></Table>') AS xmldoc
    ) AS xmltable
    CROSS APPLY xmltable.xmldoc.nodes('/Table/Rec') AS T2(item);
    RETURN;
END
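Assuming the function above is deployed, calling it is a one-liner. Note one caveat: because the input is converted to XML, elements containing XML-special characters such as & or < would need escaping first:

```sql
-- Returns one row per element of the delimited list.
SELECT Val FROM dbo.ConvertStringToTable(N'10,20,30', ',');

-- To use the declared default separator, a table-valued function still
-- requires the DEFAULT keyword rather than omitting the argument:
SELECT Val FROM dbo.ConvertStringToTable(N'10,20,30', DEFAULT);
```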
February 6, 2014 at 5:54 am
In this article, my goal was to present a variety of techniques for working with multiple values sent to a stored procedure. The final choice is up to the reader. I agree that the loop is not the best choice for production environments. Personally, I am using the "loop" split.
February 6, 2014 at 5:59 am
I have tested CLR to split a delimited string; it was much slower and took more resources. This is why I did not include the CLR option in my posting.
February 6, 2014 at 6:23 am
Oh, and I find the topic a bit misleading, as I don't think it's possible to send a table variable when calling a stored procedure from (say) an aspx page....
But it is a good topic
(2/6/2014)
Interesting, Very interesting... In addition to the comments above I'd also take issue with your xml implementation.
<Product>
<ProductID>1</ProductID>
<ProductName>Chai</ProductName>
<CompanyName>Lehmanns Marktstand</CompanyName>
<ShippersName>Speedy Express</ShippersName>
<OrderDate>12/07/2013</OrderDate>
<UnitPrice>18.0000</UnitPrice>
<Quantity>40</Quantity>
<Discount>0.15</Discount>
</Product>
XML also supports attributes. If you use attributes, then your XML will be half the string size,
e.g. <Product ProductID="1" ProductName="Chai" />
Also, you should have a user-defined function to escape the four XML escape sequences,
and you should alias the attribute names,
e.g. <Prod ID="1" PN="Chai" /> (shorter again)....
Viewing 15 posts - 1 through 15 (of 54 total)