July 6, 2011 at 7:58 am
aaron.reese (7/6/2011)
starting to get slightly OT now...
Yeah, sorry.
but you would have to prepare the XML by hand in the calling application which is probably even more work than calling separate SPs. Unless you develop a generic XML generator that reflects the table structures and relationships and then generates the XML dataset from that. It would also need to work with arrays of multiple objects, some of which may have been added, deleted or modified and handle any CRUD errors that may get thrown up by the database.
Oh hold on ....
That's LINQ and .NET 🙂
Obiron
We didn't have LINQ at the time, but we did go down the route of a generic XML generator. I didn't do that bit though. And yes, the original design looked at single objects (such as a single sales order) but we did discuss extending it to handle sets of objects (so a list of sales orders.) Validation was done prior to posting to the database, so the database just had to send back a "completed okay" or a "failed with error..." for the entire set.
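The "single object as one XML document" approach can be sketched as a stored procedure that shreds the header and the detail lines from the same document inside one transaction, reporting a single "completed okay"/"failed with error" result for the whole set. This is an illustrative sketch only; the table, column, and element names (`SalesOrderHeader`, `/SalesOrder/Lines/Line`, etc.) are hypothetical, not from the original system:

```sql
-- Hypothetical sketch: one sales order arrives as a single XML document.
CREATE PROCEDURE dbo.SaveSalesOrder
    @Order xml
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        -- Pull the header values (one per document) into variables
        DECLARE @OrderNumber varchar(20) =
            @Order.value('(/SalesOrder/OrderNumber)[1]', 'varchar(20)');

        INSERT INTO dbo.SalesOrderHeader (OrderNumber, CustomerId, OrderDate)
        SELECT @OrderNumber,
               @Order.value('(/SalesOrder/CustomerId)[1]', 'int'),
               @Order.value('(/SalesOrder/OrderDate)[1]',  'datetime');

        -- Shred the repeating detail lines in one set-based insert
        INSERT INTO dbo.SalesOrderDetail (OrderNumber, ProductId, Qty, Price)
        SELECT @OrderNumber,
               x.value('(ProductId)[1]', 'int'),
               x.value('(Qty)[1]',       'int'),
               x.value('(Price)[1]',     'money')
        FROM @Order.nodes('/SalesOrder/Lines/Line') AS t(x);

        COMMIT TRANSACTION;
        SELECT 'completed okay' AS Result;
    END TRY
    BEGIN CATCH
        -- Any failure rolls back the entire object, not individual rows
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        SELECT 'failed with error: ' + ERROR_MESSAGE() AS Result;
    END CATCH
END;
```

Because the whole object travels in one parameter, the application only needs to agree on the XML shape, not on how the database splits it into tables.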
July 6, 2011 at 2:42 pm
Bruce W Cassidy (7/6/2011)
opc.three (7/6/2011)
The preferred way (in terms of performance) to do this now is to use Table-Valued Parameters (introduced in SQL 2008) to pass a set of data to a stored procedure.
Yes and no.
It's hard to represent a set of related tables inside one parameter. In my earlier example, that means the application would have to pass sales order headers as one table-valued parameter and the sales order details as another, which in turn means the application has to know something about how the database splits those two up. Whereas I wanted an approach built around the object as a whole, so the database and application shared one definition for the object.
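For comparison, the TVP version of the same design (SQL Server 2008+) exposes the header/detail split directly in the procedure's signature, two table types that the application must populate separately. A hedged sketch with hypothetical names:

```sql
-- Illustrative only: the application now sees the header/detail split.
CREATE TYPE dbo.SalesOrderHeaderType AS TABLE
(
    OrderNumber varchar(20) NOT NULL PRIMARY KEY,
    CustomerId  int         NOT NULL,
    OrderDate   datetime    NOT NULL
);

CREATE TYPE dbo.SalesOrderDetailType AS TABLE
(
    OrderNumber varchar(20) NOT NULL,
    LineNumber  int         NOT NULL,
    ProductId   int         NOT NULL,
    Qty         int         NOT NULL,
    PRIMARY KEY (OrderNumber, LineNumber)
);
GO

CREATE PROCEDURE dbo.SaveSalesOrders
    @Headers dbo.SalesOrderHeaderType READONLY,  -- TVPs must be READONLY
    @Details dbo.SalesOrderDetailType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.SalesOrderHeader (OrderNumber, CustomerId, OrderDate)
    SELECT OrderNumber, CustomerId, OrderDate
    FROM @Headers;

    INSERT INTO dbo.SalesOrderDetail (OrderNumber, LineNumber, ProductId, Qty)
    SELECT OrderNumber, LineNumber, ProductId, Qty
    FROM @Details;
END;
```

Note that this shape handles sets of orders (many headers, many details) as naturally as a single order, but the caller is coupled to the two-table decomposition rather than to one object definition.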
It could be argued that the app is tightly coupled to the DB with either the XML or the TVP method. You have to pick your poison and weigh the degree of coupling each one introduces against its overall pros and cons.
Using table-valued parameters is definitely better from a pure SQL performance point of view, though. On the other hand, the overhead of pulling apart the XML wasn't huge, so I erred on the side of the cleaner interface.
You're right. I don't have the article handy but IIRC the performance benefits of TVPs over XML are definitely not worth a system rewrite, but for new development (on 2008+ of course) TVPs are the way I would go.
Er... I should probably point out that I was also doing this design work back in SQL Server 2005 days.
No bashing coming from me. I wasn't there. Even on 2008+, XML may make sense depending on the system requirements. Last I checked, there's still no JDBC driver that supports TVPs. In 2005, XML was the preferred method over delimited lists, lists of delimited lists, pre-built/populated temp tables :sick:, etc.
And yes, I know we're off topic...but some of the best exchanges happen that way 😀
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
July 6, 2011 at 2:48 pm
Commenting on an earlier post...
Execution plans are generated for the entire procedure the first time it runs, based on the parameter values of that execution.
http://sqlinthewild.co.za/index.php/2009/09/15/multiple-execution-paths/
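A minimal sketch of the problem the linked article describes (procedure and table names are hypothetical): when a procedure branches on a parameter, both branches are compiled on first execution using that call's sniffed value, even though only one branch actually runs.

```sql
-- Both SELECTs get plans on the first call, compiled for that call's
-- @CustomerId, regardless of which IF branch executes.
CREATE PROCEDURE dbo.SearchOrders
    @CustomerId int = NULL
AS
BEGIN
    IF @CustomerId IS NULL
        -- "return everything" path
        SELECT OrderNumber, OrderDate
        FROM dbo.SalesOrderHeader;
    ELSE
        -- "filtered" path: plan is built for the sniffed @CustomerId,
        -- which may be NULL if the first call took the other branch
        SELECT OrderNumber, OrderDate
        FROM dbo.SalesOrderHeader
        WHERE CustomerId = @CustomerId;
END;
```

One common mitigation is to move each branch into its own sub-procedure so that each gets compiled, with its own plan, only when that path is actually taken.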
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 6, 2011 at 3:11 pm
Hey GilaMonster - thanks so much for that article link. It's exactly what I was looking for 🙂
- Z