December 15, 2021 at 4:41 pm
Hi
How does the statement below work?
AND Table1.ID2 = ISNULL(@Parameter3, Table1.ID2)
Thanks
December 15, 2021 at 6:05 pm
If @Parameter3 is NULL, then ISNULL() falls back to Table1.ID2, so the predicate compares the column to itself and the filter is effectively switched off. If it's not NULL, it filters on Table1.ID2 = @Parameter3.
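To illustrate (the table and parameter names are from the original question, but the surrounding query is an assumption), the predicate behaves like an optional filter:

```sql
-- Hypothetical sketch of the kind of query the predicate appears in.
-- When @Parameter3 is NULL, ISNULL(@Parameter3, Table1.ID2) evaluates to
-- Table1.ID2, so the predicate becomes Table1.ID2 = Table1.ID2, which is
-- true for every row where ID2 is not NULL -- the filter is disabled.
-- When @Parameter3 has a value, it filters on Table1.ID2 = @Parameter3.
DECLARE @Parameter3 int = NULL;

SELECT t.*
FROM Table1 AS t
WHERE t.ID2 = ISNULL(@Parameter3, t.ID2);
-- Note: rows where ID2 IS NULL are excluded in both cases, because
-- NULL = NULL does not evaluate to true in SQL.
```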
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
December 15, 2021 at 6:31 pm
if @Parameter3 is null, then use table1.id2.
Which is typically a really poorly performing construct.
I see this far too often when something else should have been used.
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
December 15, 2021 at 6:57 pm
Yeah, I wasn't saying it was "good" - that's just what it's doing. I prefer the "WHERE (A=@a) OR (@a IS NULL)" approach myself... and depending on how often I expect @a to be NULL, I may do the @a IS NULL check first.
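For reference, that alternative shape (column and parameter names here are hypothetical) looks like this:

```sql
DECLARE @a int = NULL;  -- NULL means "don't filter on A"

SELECT t.*
FROM Table1 AS t
WHERE (@a IS NULL OR t.A = @a);
-- Unlike the ISNULL() form, this also returns rows where A IS NULL
-- when @a is NULL. Note that SQL Server does not guarantee left-to-right
-- short-circuit evaluation of predicates, so the ordering of the two
-- conditions is a readability choice more than a performance one.
```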
As a general rule, I try to avoid functions in WHERE clauses or JOIN conditions. SOMETIMES you can't avoid it, but functions do not belong in a WHERE clause or a JOIN condition.
Also, Michael, I like your signature joke about pulling the trigger. Reminds me of the "A TSQL query walks into a NoSQL bar and leaves because there were no tables to join".
December 15, 2021 at 7:18 pm
Yeah, I wasn't saying it was "good" - that's just what it's doing. I prefer the "WHERE (A=@a) OR (@a IS NULL)" approach myself... and depending on how often I expect @a to be NULL, I may do the @a IS NULL check first.
That's just as poor a construct.
Take one of your procs, clear your cache, and run it with and without the parameter being NULL. Then use Query Store and see what kind of difference there is in the execution.
Michael L John
December 15, 2021 at 7:37 pm
Wouldn't that be related to parameter sniffing in the scenario I gave, versus a function in a WHERE clause in the OP? And if so, the performance could tank due to poor initial parameters for the query, not specifically because of a NULL parameter?
I had thought the problem with ISNULL is that functions in a WHERE clause make it harder for the optimizer to determine what it should do than more explicit predicates, and that this could result in a table scan where an index seek would suffice.
What is the preferred approach to this? This scenario comes up frequently for me with SSRS reports where users have a dropdown and can either select one value or an "ALL" option, which passes NULL in as the parameter so we can use the "A=@a OR @a IS NULL" approach.
December 15, 2021 at 7:51 pm
Wouldn't that be related to parameter sniffing in the scenario I gave, versus a function in a WHERE clause in the OP? And if so, the performance could tank due to poor initial parameters for the query, not specifically because of a NULL parameter?
I had thought the problem with ISNULL is that functions in a WHERE clause make it harder for the optimizer to determine what it should do than more explicit predicates, and that this could result in a table scan where an index seek would suffice.
What is the preferred approach to this? This scenario comes up frequently for me with SSRS reports where users have a dropdown and can either select one value or an "ALL" option, which passes NULL in as the parameter so we can use the "A=@a OR @a IS NULL" approach.
It's exactly parameter sniffing.
As for the preferred approach, it depends. If there are only a few possible combinations, create multiple procs that are optimized for each possibility. Or, use parameterized dynamic SQL. Or, something else!
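As a sketch of the parameterized dynamic SQL option (the procedure, table, and column names here are hypothetical), the idea is to build only the predicates the caller actually needs, so each parameter combination gets its own cached plan:

```sql
CREATE OR ALTER PROCEDURE dbo.GetRows
    @Parameter3 int = NULL
AS
BEGIN
    DECLARE @sql nvarchar(max) = N'
        SELECT t.*
        FROM dbo.Table1 AS t
        WHERE 1 = 1';

    -- Append the filter only when the caller supplied a value.
    IF @Parameter3 IS NOT NULL
        SET @sql += N' AND t.ID2 = @p3';

    -- sp_executesql keeps the query parameterized: no injection risk from
    -- the value, and each distinct SQL text gets its own cached plan.
    EXEC sys.sp_executesql @sql, N'@p3 int', @p3 = @Parameter3;
END;
```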
Michael L John
December 15, 2021 at 10:14 pm
Michael L John wrote:
It's exactly parameter sniffing.
As for the preferred approach, it depends. If there are only a few possible combinations, create multiple procs that are optimized for each possibility. Or, use parameterized dynamic SQL. Or, something else!
Or - accept that you may get a less-than-optimal execution plan that works well enough for the report. I would not go down the path of dynamic SQL or multiple procedures for a report that is run a couple of times a day against a read-only secondary and performs well enough.
If the report is going to be run more frequently, with a lot of variation in the parameters being passed, then I would evaluate the report further. Does it need near real-time data? If so, maybe a redesign of the report and its expected parameters is a better approach. If not, then maybe a shared dataset can be used: update the report(s) to use filters instead of parameters, then schedule a refresh of the shared dataset as needed.
Many other options - before needing to go dynamic SQL or multiple procedures.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
December 16, 2021 at 1:51 pm
Jeffrey Williams wrote:
Or - accept that you possibly get a less than optimal execution plan that works well enough for the report. I would not go down the path of dynamic SQL or multiple procedures for a report that is run a couple times a day against a read-only secondary and performs well enough.
If the report is going to be run more frequently - with a lot of variations to the parameters being passed, then I would evaluate the report further. Does this report need near real-time data - if it does then maybe a redesign of the report and expected parameters is a better approach. If it doesn't - then maybe a shared dataset can be used and update the report(s) to use filters instead of parameters, then schedule a refresh of the shared dataset as needed.
Many other options - before needing to go dynamic SQL or multiple procedures.
Correct. The existing query may very well be good enough.
Michael L John
December 16, 2021 at 6:19 pm
Jeffrey Williams wrote: Many other options - before needing to go dynamic SQL or multiple procedures.
Michael L John wrote: Correct. The existing query may very well be good enough.
Except it's almost always not. It's better to consistently practice doing it the right way than to try to decide whether "good enough" actually is good enough.
This post is a near duplicate of another post this OP has made. Please see the following thread where I talk about the use of "Catch All" queries and how to do them properly.
https://www.sqlservercentral.com/forums/topic/query-144
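One commonly used way to keep the single-statement "catch all" shape while sidestepping the sniffed-plan problem - mentioned here as a general technique, not as a summary of the linked thread - is OPTION (RECOMPILE), which lets the optimizer treat the parameter values as known constants on each execution:

```sql
DECLARE @a int = NULL;

SELECT t.*
FROM Table1 AS t
WHERE (@a IS NULL OR t.A = @a)
OPTION (RECOMPILE);
-- With RECOMPILE, when @a is NULL the optimizer can fold the predicate
-- away entirely; when @a has a value it can build a seek plan for it.
-- The trade-off is a compile on every execution, so this fits
-- low-frequency reporting queries better than hot OLTP paths.
```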
--Jeff Moden
Change is inevitable... Change for the better is not.