July 1, 2011 at 8:19 am
We have a developer who likes to have a single stored procedure return completely different result sets based on an @action parameter being passed. In general they look something like this:
IF @action = 'A'
BEGIN
    ...
END
IF @action = 'B'
BEGIN
    ...
END
My thought is that having separate procs for each allows for a better execution plan to be generated. Any thoughts/evidence on this?
Thanks,
Z
July 1, 2011 at 10:23 am
Rob Scholl (7/1/2011)
We have a developer who likes to have a single stored procedure return completely different result sets based on an @action parameter being passed. In general they look something like this:
IF @action = 'A'
BEGIN
    ...
END
IF @action = 'B'
BEGIN
    ...
END
My thought is that having separate procs for each allows for a better execution plan to be generated. Any thoughts/evidence on this?
Thanks,
Z
Execution plans are generated at the statement level, not at the proc level. However, too much logic in a single proc can cause it to recompile more often than one would like, depending on what it's doing.
Regarding code management and cleanliness: from a logical standpoint I find it better to have more, smaller procs that each do a specific task. I also like it when the output interface of a proc is the same regardless of the input parameters. They're easier to maintain and manage in terms of unit testing and version control... but that's just my preference.
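As a minimal sketch of what I mean (all table, column, and proc names here are made up for illustration), each task gets its own single-purpose proc with a stable output shape:

```sql
-- Hypothetical example: one proc per task instead of one @action switch.
CREATE PROCEDURE dbo.GetCustomerOrders
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
END
GO

CREATE PROCEDURE dbo.GetCustomerAddresses
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT AddressLine1, City, PostalCode
    FROM dbo.Addresses
    WHERE CustomerID = @CustomerID;
END
GO
```

Each proc returns one predictable result set, which keeps unit tests and callers simple.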
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
July 1, 2011 at 10:31 am
I second opc.three.
In the scenario you describe I might use a "master proc" with the general structure as shown.
Depending on the logic inside each IF block I'd either have it directly implemented or (if either complex or used in multiple places) extracted into a separate procedure.
I most probably would not have a separate sproc or function for a single (and simple) SELECT statement, and neither would I put hundreds of lines of code in each IF block.
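To sketch the "master proc" idea (proc and parameter names are hypothetical): the outer proc only dispatches, and anything complex or reused lives in its own procedure.

```sql
-- Hypothetical dispatcher: each branch either runs a simple statement
-- inline or calls a separate single-purpose procedure.
CREATE PROCEDURE dbo.CustomerAction
    @action char(1),
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    IF @action = 'A'
        EXEC dbo.GetCustomerOrders @CustomerID = @CustomerID;
    ELSE IF @action = 'B'
        EXEC dbo.GetCustomerAddresses @CustomerID = @CustomerID;
END
```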
July 5, 2011 at 7:39 am
Thanks for the feedback. I'm of the same mind. Smaller single purpose procs are much easier to maintain and re-use.
July 6, 2011 at 6:17 am
While I think SQL Server is smart enough to deal with the execution plans for the different parts of the code, writing stored procedure logic this way isn't a great idea if you want to achieve any sort of black boxing or abstraction. The rule is pretty simple: any sort of function should do "one thing" (and that one thing should be directly related to the name of the function), and a procedure that returns a result set is in this respect a type of function.
If you want an excellent discussion on why this type of coding isn't ideal, I can heartily recommend reading Code Complete, by Steve McConnell.
July 6, 2011 at 7:08 am
Hello!
I commonly write stored procs that handle add/edits to a single table. If the ID value passed in is zero, then it adds the record to the table. If the ID value passed in is not zero, it edits the record in the table using the ID in the Where clause. I'd really appreciate any thoughts on whether this is a good practice, or that having a separate add proc and edit proc would be a better practice. It seems like the comments here indicate separate procs, but the actions are related to my mind... Thanks so much!
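For reference, a minimal sketch of the pattern I mean (table and column names made up): zero inserts, non-zero updates.

```sql
-- Hypothetical add-or-edit proc: @ID = 0 inserts a new row,
-- otherwise the row with that ID is updated.
CREATE PROCEDURE dbo.SaveWidget
    @ID int,
    @Name varchar(50)
AS
BEGIN
    SET NOCOUNT ON;
    IF @ID = 0
        INSERT INTO dbo.Widgets (Name)
        VALUES (@Name);
    ELSE
        UPDATE dbo.Widgets
        SET Name = @Name
        WHERE WidgetID = @ID;
END
```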
July 6, 2011 at 7:13 am
lbohm 63578 (7/6/2011)
Hello! I commonly write stored procs that handle add/edits to a single table. If the ID value passed in is zero, then it adds the record to the table. If the ID value passed in is not zero, it edits the record in the table using the ID in the Where clause. I'd really appreciate any thoughts on whether this is a good practice,
Check out the MERGE statement. This does exactly what you are trying to achieve in one statement.
July 6, 2011 at 7:14 am
Well, you could probably do both of those with one MERGE statement. Either way though, they are one logical thing (you add a record), and you always get the same result.
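A rough sketch of that MERGE, assuming SQL 2008 or later (the table and column names here are hypothetical):

```sql
-- The add-or-edit logic as a single MERGE statement: update the row if
-- the key matches, insert it if it doesn't.
MERGE dbo.Widgets AS target
USING (SELECT @ID AS WidgetID, @Name AS Name) AS source
    ON target.WidgetID = source.WidgetID
WHEN MATCHED THEN
    UPDATE SET Name = source.Name
WHEN NOT MATCHED THEN
    INSERT (Name) VALUES (source.Name);
```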
July 6, 2011 at 7:19 am
Ah yes. The MERGE statement wasn't available in SQL 2005, was it? I've seen references to it but haven't had a chance to use it. Thanks!
July 6, 2011 at 7:28 am
Can't remember whether it was introduced in SQL 2005 or 2008, sorry.
Now, if you really wanted something to think about... try thinking about being able to pass a set of data from your application to your stored procedure to write to the database. What's more, it could be a set of object data (which may contain relationships).
For example, if my application is dealing with a "sales order" as an object, to write that to the database I pass the entire sales order to a single stored procedure. Internally the procedure writes out the required header and detail rows (inserting, updating or deleting as necessary), but the interface between the application code and the database code is at the object level, not the database table level.
Not impossible to do; the approach I came up with was to pass the object to the stored procedure as typed-XML.
Again, the procedure is doing one (and only one) logical thing (writing a sales order to the database tables.)
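A simplified sketch of the idea (element, attribute, table and column names are all hypothetical; I've omitted the update/delete branches for brevity) using SQL Server's XML nodes()/value() methods to shred the document:

```sql
-- Hypothetical proc: the whole sales order arrives as one XML document
-- and is shredded into header and detail rows inside the procedure.
CREATE PROCEDURE dbo.SaveSalesOrder
    @Order xml
AS
BEGIN
    SET NOCOUNT ON;

    -- Header row (one per document)
    INSERT INTO dbo.SalesOrderHeader (OrderNumber, CustomerID)
    SELECT o.value('@OrderNumber', 'varchar(20)'),
           o.value('@CustomerID', 'int')
    FROM @Order.nodes('/Order') AS H(o);

    -- Detail rows (one per <Line> element)
    INSERT INTO dbo.SalesOrderDetail (OrderNumber, ProductID, Qty)
    SELECT o.value('@OrderNumber', 'varchar(20)'),
           d.value('@ProductID', 'int'),
           d.value('@Qty', 'int')
    FROM @Order.nodes('/Order') AS H(o)
    CROSS APPLY o.nodes('Line') AS L(d);
END
```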
July 6, 2011 at 7:30 am
2008.
July 6, 2011 at 7:38 am
I've had the infamous "I haven't needed to do those kinds of object transactions" excuse - I just became an 'accidental DBA' moving from a low-key, low-needs environment (higher ed, low tech users) to the 'real world' 🙂 I'm inheriting an "Oh, it was just a pilot so we didn't need to do THAT...." server/database/application, and I'm trying to stuff as much information in my head as I can so I at least have an idea of what I don't know... (and that is vast.) Once I get a better handle on at least some things, I can make more recommendations about database optimization (even though it is already in production and the front-end application is written. Sigh.) I'm sure that object transactions would be something very useful... I'm sure I'll be posting many, many more questions on the forum. Many. 🙂 Thanks, everyone, for being here for those of us that aren't exactly sinking yet, but more like frantically paddling....
July 6, 2011 at 7:44 am
Bruce W Cassidy (7/6/2011)
Can't remember whether it was introduced in SQL 2005 or 2008, sorry. Now, if you really wanted something to think about... try thinking about being able to pass a set of data from your application to your stored procedure to write to the database. What's more, it could be a set of object data (which may contain relationships).
For example, if my application is dealing with a "sales order" as an object, to write that to the database I pass the entire sales order to a single stored procedure. Internally the procedure writes out the required header and detail rows (inserting, updating or deleting as necessary), but the interface between the application code and the database code is at the object level, not the database table level.
Not impossible to do; the approach I came up with was to pass the object to the stored procedure as typed-XML.
Again, the procedure is doing one (and only one) logical thing (writing a sales order to the database tables.)
The preferred way (in terms of performance) to do this now is to use Table-Valued Parameters (introduced in SQL 2008) to pass a set of data to a stored procedure.
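Roughly, that looks like this (type, proc, table and column names are hypothetical; requires SQL 2008+):

```sql
-- A user-defined table type defines the shape of the parameter...
CREATE TYPE dbo.OrderLineType AS TABLE
(
    ProductID int NOT NULL,
    Qty       int NOT NULL
);
GO

-- ...and the proc receives a whole set of rows in one call.
-- TVPs must be declared READONLY.
CREATE PROCEDURE dbo.SaveOrderLines
    @OrderID int,
    @Lines   dbo.OrderLineType READONLY
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.SalesOrderDetail (OrderID, ProductID, Qty)
    SELECT @OrderID, ProductID, Qty
    FROM @Lines;
END
```

From ADO.NET the parameter can be populated from a DataTable, so the application sends the whole set in a single round trip.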
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
July 6, 2011 at 7:53 am
starting to get slightly OT now...
but you would have to prepare the XML by hand in the calling application, which is probably even more work than calling separate SPs. Unless you develop a generic XML generator that reflects the table structures and relationships and then generates the XML dataset from that. It would also need to work with arrays of multiple objects, some of which may have been added, deleted or modified, and handle any CRUD errors the database may throw up.
Oh hold on ....
That's LINQ and .NET 🙂
Obiron
July 6, 2011 at 7:54 am
opc.three (7/6/2011)
The preferred way (in terms of performance) to do this now is to use Table-Valued Parameters (introduced in SQL 2008) to pass a set of data to a stored procedure.
Yes and no.
It's hard to represent a set of related tables inside one parameter, which (in my earlier example) means the application would have to pass sales order headers as one table-valued parameter and the sales order details as another, which in turn means the application has to know something about how the database splits those two up. Whereas I wanted an approach built around the object as a whole, so the database and application shared one definition of the object.
Using table-valued parameters is definitely better from a pure SQL performance point of view though. On the other hand, the overhead of pulling apart the XML wasn't huge, so I erred on the side of the better level for the interface.
Er... I should probably point out that I was also doing this design work back in SQL Server 2005 days.