October 7, 2014 at 5:25 pm
TheSQLGuru (10/7/2014)
Try your XML method under stress and see how it holds up. Or with lots of rows. There are some very nice threads here on SSC.com about string parsing into tables. Search DelimitedSplit8K
The XML method should hold up nicely for such parameter passing IF it's passed as XML. If it's passed as CSV and someone tries to use an XML-based splitter on it, then I agree... performance is going to suck simply because of the conversion of delimiters to XML tags.
Of course, DelimitedSplit8K is my favorite method for doing splits, but it was specifically designed with performance in mind for VARCHAR(8000) or less. If the passed string of parameters is longer than that, another method might have to be considered, because just changing it to handle VARCHAR(MAX) will cause it to run at least twice as slow; the BLOB datatypes don't like to be joined to.
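For reference, a minimal sketch of both approaches follows. It assumes DelimitedSplit8K is installed with its usual (@pString, @pDelimiter) signature returning ItemNumber and Item; the variable names and sample values are just for illustration.

-- Passing the list as native XML and shredding it with .nodes()
DECLARE @Ids XML = N'<ids><id>1</id><id>7</id><id>42</id></ids>';

SELECT x.id.value('(./text())[1]', 'INT') AS ProductID
FROM @Ids.nodes('/ids/id') AS x(id);

-- Passing the list as CSV (VARCHAR(8000) or less) and splitting it with DelimitedSplit8K
DECLARE @Csv VARCHAR(8000) = '1,7,42';

SELECT CAST(s.Item AS INT) AS ProductID
FROM dbo.DelimitedSplit8K(@Csv, ',') AS s;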
--Jeff Moden
Change is inevitable... Change for the better is not.
October 8, 2014 at 6:57 am
Try something new, like a table-valued parameter; it's simpler and more powerful.
Or cast the XML type inside an inline function.
http://msdn.microsoft.com/en-us/library/bb675163(v=vs.110).aspx
http://www.brentozar.com/archive/2014/02/using-sql-servers-table-valued-parameters/
http://www.codeproject.com/Articles/39161/C-and-Table-Value-Parameters
For BI reports (SSRS), see: http://geekswithblogs.net/GruffCode/archive/2012/06/21/using-table-valued-parameters-with-sql-server-reporting-services.aspx
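To make the TVP suggestion concrete, here is a minimal sketch; the type, procedure, and table names (dbo.IntList, dbo.GetOrders, dbo.Orders) are illustrative, not from the original post.

-- A table type to hold the list of IDs (name is illustrative)
CREATE TYPE dbo.IntList AS TABLE (Id INT NOT NULL PRIMARY KEY);
GO

-- TVP parameters must be declared READONLY
CREATE PROCEDURE dbo.GetOrders
    @ProductIds dbo.IntList READONLY
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.*
    FROM dbo.Orders AS o                          -- dbo.Orders is a stand-in table
    JOIN @ProductIds AS p ON p.Id = o.ProductID;
END;
GO

-- Calling it from T-SQL (ADO.NET can pass a DataTable or IEnumerable<SqlDataRecord>)
DECLARE @Ids dbo.IntList;
INSERT @Ids (Id) VALUES (1), (7), (42);
EXEC dbo.GetOrders @ProductIds = @Ids;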
October 8, 2014 at 7:18 am
TheSQLGuru (10/7/2014)
Try your XML method under stress and see how it holds up. Or with lots of rows. There are some very nice threads here on SSC.com about string parsing into tables. Search DelimitedSplit8K
For a stored procedure that drives something like a BI dashboard or month-end report, where the upfront parameter parsing is a very small percentage of the overall data processing, the XML method should be fine.
However, you're right that the XML parsing method would not hold up in a high-volume OLTP scenario. The original post described an issue with a stored procedure that is "called all the time", and the poster had reason to believe there was contention in tempdb and high CPU utilization. When I tested the XML parsing version with STATISTICS IO and STATISTICS TIME, it built up a 12K-page worktable and clearly required more elapsed time, CPU, and I/O than Jeff's DelimitedSplit8K method, so I agree that DelimitedSplit8K would be best in this case.
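For anyone repeating the comparison, this is roughly how those numbers can be captured; the procedure and parameter names below are placeholders.

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.MyReportProc @ProductList = '1,7,42';   -- placeholder proc and CSV parameter

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
-- The Messages tab then shows logical reads per table (including any worktable)
-- plus the CPU and elapsed time for each statement.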
Actually, what I typically do for BI reporting procedures is pass nothing but a @ReportID parameter. There is a primary Report_Run (ReportID, ReportStartDT, ReportEndDT, etc.) table, with additional tables like Report_ProductID (ReportID, ProductID) containing the actual runtime parameters in a normalized fashion.
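A minimal sketch of that pattern; everything beyond Report_Run, Report_ProductID, ReportID, ReportStartDT, ReportEndDT, and ProductID (including dbo.Sales and the report proc itself) is illustrative.

-- Normalized parameter tables keyed by ReportID
CREATE TABLE dbo.Report_Run (
    ReportID      INT IDENTITY PRIMARY KEY,
    ReportStartDT DATETIME2 NOT NULL,
    ReportEndDT   DATETIME2 NOT NULL
);

CREATE TABLE dbo.Report_ProductID (
    ReportID  INT NOT NULL REFERENCES dbo.Report_Run (ReportID),
    ProductID INT NOT NULL,
    PRIMARY KEY (ReportID, ProductID)
);
GO

-- The report proc receives only @ReportID and joins to the normalized parameters
CREATE PROCEDURE dbo.ReportSales
    @ReportID INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT s.ProductID, SUM(s.Amount) AS TotalAmount
    FROM dbo.Sales AS s                        -- dbo.Sales is a stand-in fact table
    JOIN dbo.Report_ProductID AS rp
      ON rp.ProductID = s.ProductID
    JOIN dbo.Report_Run AS rr
      ON rr.ReportID = rp.ReportID
     AND s.SaleDT >= rr.ReportStartDT
     AND s.SaleDT <  rr.ReportEndDT
    WHERE rp.ReportID = @ReportID
    GROUP BY s.ProductID;
END;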
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
October 8, 2014 at 5:02 pm
So, back to my original topic. After getting a little bit of time today to do some testing of the various things that have been suggested, here's what I found for an identical 10-batch run of each proc:
Inline SQL:
CPU: 800
Reads: 28000
Time: 800
#Temp table:
CPU: 680
Reads: 38000
Time: 680
@Table variable:
CPU: 110
Reads: 32000
Time: 110
I also replaced the current parser with the DelimitedSplit8K parser in the @table variable version to see if that would move the needle at all:
@Table variable (DelimitedSplit8K):
CPU: 70
Reads: 27000
Time: 70
#Temp table (DelimitedSplit8K):
CPU: 590
Reads: 28000
Time: 590
It definitely parsed the string faster, but it didn't make as much of an impact as going from the #temp table to the @table variable. As you can see, the @table variable ended up giving the lowest cost. However, as someone mentioned earlier, that's only because the optimizer estimates an incorrectly large number of rows coming out of the join of the two large tables in the other plans, while the @table variable is assumed to have one row, which leads to a better plan even when it parses in 1,000 IDs.
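For context, the @table variable + DelimitedSplit8K version being measured looks roughly like this; the large table, column, and parameter names are stand-ins for the real ones in the proc.

-- CSV parameter as passed to the proc (placeholder value)
DECLARE @ProductList VARCHAR(8000) = '1,7,42';

DECLARE @Ids TABLE (ProductID INT NOT NULL PRIMARY KEY);

INSERT @Ids (ProductID)
SELECT CAST(s.Item AS INT)
FROM dbo.DelimitedSplit8K(@ProductList, ',') AS s;

-- The optimizer assumes one row in @Ids, which happened to yield the cheaper plan here
SELECT t1.*, t2.*
FROM dbo.BigTable1 AS t1                      -- stand-ins for the "two large tables"
JOIN dbo.BigTable2 AS t2 ON t2.KeyID = t1.KeyID
JOIN @Ids AS i ON i.ProductID = t1.ProductID;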
Anyway, thanks for turning me on to the DelimitedSplit8K parser, since it will definitely be of use here and in many other areas where we've been using a slower inch-worm parser.