June 2, 2009 at 9:55 am
TheSQLGuru (6/2/2009)
Interesting. You also get a seek if you simply hard code values for coalesce (and I presume isnull) such as this example. It avoids having to hit the table for the pair of MAXs:
select TransactionID
from [Production].[TransactionHistory]
where ProductID BETWEEN coalesce(@ProductID, 0) AND coalesce(@ProductID, 99999999) --should use actual limits of INT here!
You don't get the seek/bookmark lookup if you do select * with that query. Still a CI scan.
I've used this in the past after reading an article about it somewhere; I don't think it was on SSC. Of course SELECT * will give a scan, because with SELECT * you'd need a seek plus a bookmark lookup for every row, so the scan is probably faster.
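The "actual limits of INT" mentioned in the snippet's comment would make the bounds safe for any possible ProductID; a sketch, assuming SQL Server's documented INT range:

```sql
-- Sketch: using the true INT range so no non-NULL @ProductID can fall outside the bounds
-- (-2147483648 and 2147483647 are the documented limits of the INT data type)
select TransactionID
from [Production].[TransactionHistory]
where ProductID BETWEEN coalesce(@ProductID, -2147483648)
                    AND coalesce(@ProductID, 2147483647)
```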
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
June 2, 2009 at 10:02 am
GilaMonster (6/2/2009)
TheSQLGuru (6/2/2009)
Interesting. You also get a seek if you simply hard code values for coalesce (and I presume isnull) such as this example. It avoids having to hit the table for the pair of MAXs:
Yup, though as soon as you add a second condition, it goes back to a clustered index scan (at least for me). What do you get here? Index scan/bookmark lookup?
ALTER PROCEDURE Test2 ( @ProdID int = null, @Qty int = null)
AS
select TransactionID
from [Production].[TransactionHistory]
where ProductID BETWEEN coalesce(@ProdID, 0) AND coalesce(@ProdID, 99999999) --should use actual limits of INT here!
AND Quantity BETWEEN coalesce(@Qty, 0) AND coalesce(@Qty, 99999999)
GO
Exec Test2 @prodID = 790
Wouldn't the scan be expected with the additional criteria since Quantity is not part of nor included in the index?
If I make the index ProductID, Quantity or add Quantity as an included column I still get a seek.
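The two index variants described above would look something like this (a sketch; the index names are illustrative):

```sql
-- Sketch: Quantity as part of the index key
CREATE INDEX IX_TransactionHistory_ProductID_Quantity
    ON [Production].[TransactionHistory] (ProductID, Quantity)

-- Or: Quantity as an included (leaf-level only) column
CREATE INDEX IX_TransactionHistory_ProductID_IncQuantity
    ON [Production].[TransactionHistory] (ProductID)
    INCLUDE (Quantity)
```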
Jack Corbett
June 2, 2009 at 10:54 am
alter PROCEDURE Test4 ( @ProdID int = null, @Qty int = null)
AS
select TransactionID
from [Production].[TransactionHistory] -- with (index = [IX_TransactionHistory_ProductID])
where ProductID BETWEEN coalesce(@ProdID, 0) AND coalesce(@ProdID, 99999999) --should use actual limits of INT here!
AND Quantity BETWEEN coalesce(@Qty, 0) AND coalesce(@Qty, 99999999)
GO
Exec Test4 @ProdID = 790
4.16 cost with forced index, 11 IO
0.711 cost without forced index (CI scan), 792 IO
This is due to the mathematics of the optimizer: because of the much higher cost it assigns to the known-to-be-non-sequential IO of the index seek/bookmark lookup, the estimated plan cost of seeking and looking up the rows is much higher than scanning the entire table, despite the seek plan doing significantly fewer total IOs.
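The IO figures above can be reproduced with SET STATISTICS IO; a sketch, assuming the AdventureWorks objects used earlier in the thread:

```sql
-- Sketch: comparing logical reads for the forced-index plan vs. the optimizer's choice
DECLARE @ProdID int
SET @ProdID = 790

SET STATISTICS IO ON

-- Forced nonclustered index: seek + bookmark lookups (few logical reads)
select TransactionID
from [Production].[TransactionHistory] with (index = [IX_TransactionHistory_ProductID])
where ProductID BETWEEN coalesce(@ProdID, 0) AND coalesce(@ProdID, 99999999)

-- Optimizer's choice: clustered index scan (many more logical reads)
select TransactionID
from [Production].[TransactionHistory]
where ProductID BETWEEN coalesce(@ProdID, 0) AND coalesce(@ProdID, 99999999)

SET STATISTICS IO OFF
```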
Gail, I wonder if your larger table would still be more efficient doing the scan than with the forced seek?
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 2, 2009 at 10:58 am
Gail, I really don't want to argue this point - but I am having a small problem here. I have a query where I use this construct and include the with recompile option, and the actual execution plans for different criteria all use index seeks where appropriate.
I have tested this multiple times and still see index seeks being selected for each plan with different parameters.
Now, I think I understand what is happening and why it works on my system - but I really need to set up a test. I just have not had time yet.
Can you see what happens if you modify the parameterization option from simple to forced? Does this make any difference at all? If not, then I really don't understand why I can get index seeks on this procedure and very good performance.
BTW - the tables I am accessing are:
PatientAccessLog 51,867,860 rows
AuditLog 198,028,731 rows
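For reference, the kind of construct being described - optional parameters plus the recompile option - might be sketched as follows (all object names here are illustrative, not the actual schema):

```sql
-- Sketch: optional-parameter search, recompiled on every execution
-- (procedure, table, and column names are illustrative)
CREATE PROCEDURE dbo.SearchPatientAccessLog
    @PatientID int = NULL
WITH RECOMPILE
AS
SELECT LogID
FROM dbo.PatientAccessLog
WHERE PatientID BETWEEN COALESCE(@PatientID, 0) AND COALESCE(@PatientID, 2147483647)
GO
```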
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
June 2, 2009 at 11:38 am
Wouldn't the scan be expected with the additional criteria since Quantity is not part of nor included in the index?
If I make the index ProductID, Quantity or add Quantity as an included column I still get a seek.
I'm guessing that the optimizer is ignoring the Quantity SARGs since @Qty is NULL.
Can't prove that at the moment, though.
June 2, 2009 at 11:55 am
Jeffrey Williams (6/2/2009)
Gail, I really don't want to argue this point - but I am having a small problem here. I have a query where I use this construct and include the with recompile option, and the actual execution plans for different criteria all use index seeks where appropriate.
I have tested this multiple times and still see index seeks being selected for each plan with different parameters.
Now, I think I understand what is happening and why it works on my system - but I really need to setup a test. I just have not had time yet.
Can you see what happens if you modify the parameterization option from simple to forced? Does this make any difference at all? If not, then I really don't understand why I can get index seeks on this procedure and very good performance.
BTW - the tables I am accessing are:
PatientAccessLog 51,867,860 rows
AuditLog 198,028,731 rows
Maybe with that many rows the optimizer is simply hoping it "gets lucky" and the seek/lookup plan is less costly than the massive cost of a table scan! :w00t:
Best,
Kevin G. Boles
June 2, 2009 at 1:22 pm
TheSQLGuru (6/2/2009)
Maybe with that many rows the optimizer is simply hoping it "gets lucky" and the seek/lookup plan is less costly than the massive cost of a table scan! :w00t:
Maybe 🙂
I am still confused by this - because I am getting optimal plans for each case with optional criteria, and according to everything I have read in this thread and others, using this format I should not be.
I have also tested this by removing the check for an is null parameter - the plan is the same, uses the same indexes and performs the same for the different possibilities.
I am really wondering if setting the parameterization to forced is why this works. I really need to set up a test database and check this out.
Jeffrey Williams
June 2, 2009 at 2:03 pm
1) I am curious which of the listed reasons led you to choose forced parameterization for your database. See here in BOL:
ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/udb9/html/068282b7-c419-4e2c-990a-1f8b7c12762f.htm
I do have a client with a metric butt-ton of really horrible looping ADO code, and we picked up 30-35% throughput gains with forced parameterization.
2) It doesn't make a difference on my copy (SQL 2005 SP2):
use master
go
alter database AdventureWorks set PARAMETERIZATION FORCED
go
use adventureworks
go
dbcc freeproccache
go
exec test2 @productid = 790
use master
go
alter database AdventureWorks set PARAMETERIZATION SIMPLE
go
Best,
Kevin G. Boles
June 2, 2009 at 2:16 pm
TheSQLGuru (6/2/2009)
1) I am curious which of the listed reasons led you to choose forced parameterization for your database. See here in BOL:
ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/udb9/html/068282b7-c419-4e2c-990a-1f8b7c12762f.htm
I do have a client with a metric butt-ton of really horrible looping ADO code, and we picked up 30-35% throughput gains with forced parameterization.
2) doesn't make a difference on my copy (sql 2005 sp2):
use master
go
alter database AdventureWorks set PARAMETERIZATION FORCED
go
use adventureworks
go
dbcc freeproccache
go
exec test2 @productid = 790
use master
go
alter database AdventureWorks set PARAMETERIZATION SIMPLE
go
The vendor that supplies the application that uses this system recommended setting this parameter. In their testing they found a significant improvement for their application.
Jeffrey Williams
June 2, 2009 at 3:09 pm
The vendor that supplies the application that uses this system recommended setting this parameter. In their testing they found a significant improvement for their application.
Perhaps because they found that their poorly written code (IS NULL OR constructs) required it to perform acceptably?!? :hehe:
Still not sure about that being the issue though. Can you restore a backup of your database (probably pretty darn big given the table sizes you listed), set forced parameterization off, and see if you get the same query plans? Or maybe just change production for a few minutes off-hours (if doable, obviously)?
Best,
Kevin G. Boles
June 2, 2009 at 3:54 pm
TheSQLGuru (6/2/2009)
Perhaps because they found that their poorly written code (IS NULL OR constructs) required it to perform acceptably?!? :hehe:
Still not sure about that being the issue though. Can you restore a backup of your database (probably pretty darn big given the table sizes you listed), set forced parameterization off, and see if you get the same query plans? Or maybe just change production for a few minutes off-hours (if doable, obviously)?
More likely just the way they call the procedures - or, it could be the fact that they have a lot of cursors in their code 🙂
As for restoring a copy - not going to be possible at this time. I don't have the storage to restore another copy of a 350GB database, which is also copied for the mirror and also a copy available for the test system. At least, not at the moment.
Jeffrey Williams
June 2, 2009 at 4:43 pm
Jeffrey Williams (6/2/2009)
Gail, I really don't want to argue this point - but I am having a small problem here. I have a query where I use this construct and include the with recompile option, and the actual execution plans for different criteria all use index seeks where appropriate.
SQL 2008?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
June 2, 2009 at 5:35 pm
Nope - SQL Server 2005 SP3 and had the same experience on 2005 SP2 CU7.
Jeffrey Williams
June 2, 2009 at 6:11 pm
What if the age is not NULL but is empty? I think that is another kind of problem.
June 3, 2009 at 12:38 am
Jeffrey Williams (6/2/2009)
Nope - SQL Server 2005 SP3 and had the same experience on 2005 SP2 CU7.
Odd. I've played around with RECOMPILE on these before, and on 2005 I always get 'average' plans. Might be the table size; I don't test on anything under 500,000 rows.
Gail Shaw