July 29, 2012 at 4:38 pm
I have a pile of stored procedures that execute various SELECT statements based on parameters supplied by an application. Here's a short example, but some of them have many fields and joins of many tables:
ALTER PROCEDURE [RychlyFiltr].[spCoreCount]
    @LtrFiltr varchar(100) = '',
    @Skip1E char(1) = ''
WITH RECOMPILE, EXECUTE AS OWNER
AS
BEGIN
    SET NOCOUNT ON
    /* Where @LtrFiltr is supplied, use it as-is; 1E is treated the same as the
       other letters. When @LtrFiltr is empty, check @Skip1E to decide whether
       to show or hide the 1E records. */
    IF LEN(@LtrFiltr) > 0
        SELECT COUNT(1)
        FROM (SELECT AkcesAutoID, GenusAutoID, SpeciesAutoID, Lokalita
              FROM Podrobnosti
              INNER JOIN (SELECT * FROM dbo.fntStringsToTable(@LtrFiltr)) L
                      ON Podrobnosti.EvidenceLetter = L.EvidenceLetter) P
        WHERE (P.GenusAutoID <> 0 OR P.SpeciesAutoID <> 0 OR LEN(P.Lokalita) <> 0)
    ELSE IF @Skip1E = 'T'
        SELECT COUNT(1)
        FROM Podrobnosti P
        WHERE P.EvidenceLetter <> '1E'
          AND (P.GenusAutoID <> 0 OR P.SpeciesAutoID <> 0 OR LEN(P.Lokalita) <> 0)
    ELSE
        SELECT COUNT(1)
        FROM Podrobnosti P
        WHERE (P.GenusAutoID <> 0 OR P.SpeciesAutoID <> 0 OR LEN(P.Lokalita) <> 0)
END
Would it be better to code the separate possibilities as different stored procedures, have the application call the right one, and drop the parameters? Some parameters would have to stay - in this case @LtrFiltr, since its possible values are numerous - but I could replace the one procedure here with three: one where @LtrFiltr contains letters, a second where @LtrFiltr is empty and @Skip1E equals 'T', and a third where neither of those is true.
The number of procedures would climb quite a bit, but I could remove the RECOMPILE directive, allowing SQL Server to cache the query plans, and eliminate some parameters. This particular query executes pretty much instantly, but some are a little more work for the server. Is there any point in redoing this, or is the improvement from separate procedures likely to be too small to be worth the trouble?
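For reference, the split I'm describing would look something like this - a sketch only, with made-up procedure names; each branch of the original IF becomes its own procedure with a stable, cacheable plan:

-- Branch where @LtrFiltr is empty and @Skip1E = 'T' (name is hypothetical)
CREATE PROCEDURE [RychlyFiltr].[spCoreCountSkip1E]
AS
BEGIN
    SET NOCOUNT ON
    SELECT COUNT(1)
    FROM Podrobnosti P
    WHERE P.EvidenceLetter <> '1E'
      AND (P.GenusAutoID <> 0 OR P.SpeciesAutoID <> 0 OR LEN(P.Lokalita) <> 0)
END

The other two branches would become similar procedures, one of which would still need the @LtrFiltr parameter.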
July 30, 2012 at 2:49 pm
I wouldn't start splitting out your functionality into separate sprocs in the hope it will improve performance. Spend the time to understand why the slow ones are slow, and address those as necessary.
As to the question of splitting out into separate sprocs, keep in mind the possible maintenance efforts/costs:
- With multiple sprocs, if some functionality common to all needs to be changed, can we keep track of the multiple sprocs to ensure changes are sufficient and complete?
- If we want to keep all functionality in one sproc, is it written/documented well enough to allow us to understand it a year down the road?
I realize this doesn't necessarily answer your question, but these are some of the issues I'd consider before re-engineering.
-MarkO
"You do not really understand something until you can explain it to your grandmother" - Albert Einstein
July 30, 2012 at 3:17 pm
myheadhurts (7/30/2012)
I wouldn't start splitting out your functionality into separate sprocs in the hope it will improve performance. Spend the time to understand why the slow ones are slow, and address those as nec.
Thank you, Mark, I have considered the points you mention. I know why some of the queries are slow: they select lots of data from many joined tables. It's a habit I'm slowly trying to wean my users off - requesting a large dataset and scrolling through it. But I have to do it slowly; if I change too many things at once, they get tense, and being on good terms with everyone here is quite valuable to me. My question was, in general, whether it is worth trying to split out such things.
I had already sort of figured that it was probably not time well spent, thanks for confirming it. The re-use issue is already plaguing me, even with the number of procedures I currently have - I really wasn't looking for a reason to make it worse, unless there was some compelling performance consideration.
I thought the recompile business might be a factor, but in my own testing, slow queries remain slow, even when they are isolated and not recompiled, and I didn't notice any difference when they were contained within a recompiled, parameter-driven procedure. However, my testing was not exhaustive and I'm still new enough at SS to not always be sure of my results.
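One thing I did run across while reading is statement-level recompilation: instead of WITH RECOMPILE on the whole procedure, you can append OPTION (RECOMPILE) to just the individual SELECTs whose plans depend on the parameters, something like:

SELECT COUNT(1)
FROM Podrobnosti P
WHERE P.EvidenceLetter <> '1E'
  AND (P.GenusAutoID <> 0 OR P.SpeciesAutoID <> 0 OR LEN(P.Lokalita) <> 0)
OPTION (RECOMPILE)

That would let the rest of the procedure keep cached plans, but I haven't measured whether it makes any difference here, so treat it as an untested idea.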
So, I'll leave my parameters in place and concentrate on classic performance tuning. Appreciate the thoughts.