September 10, 2016 at 10:07 pm
I'm trying to optimize a slow query:
DECLARE @sys_change_context UNIQUEIDENTIFIER = 'DC59A04B-114D-480F-A3AB-B69673A5EB09'
      , @last_sync_version BIGINT = 42171184
      , @changes INT = 0;

IF EXISTS ( SELECT 1
            FROM UserPrefs
            RIGHT OUTER JOIN CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version /* 42171184 */) AS ct
                ON [UserPrefs].[UserPrefSys] = ct.[UserPrefSys]
            WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
                    OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
              AND ct.SYS_CHANGE_OPERATION IN ('I', 'D') )
    SELECT @changes = 1;
It normally takes 10 seconds. When testing, if I replace the @last_sync_version variable with the constant 42171184, it runs in 1 second.
This is because the optimizer chooses different plans for a variable versus a constant.
Since I cannot hard-code the value in the stored procedure, another option is to add OPTION (RECOMPILE).
But query hints cannot be used within subqueries, so the hint cannot go inside the EXISTS.
So I removed the IF EXISTS:
SELECT TOP 1 *
-- COUNT(*)
FROM UserPrefs
RIGHT OUTER JOIN CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
    ON [UserPrefs].[UserPrefSys] = ct.[UserPrefSys]
WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
        OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
  AND ct.SYS_CHANGE_OPERATION IN ('I', 'D')
OPTION (RECOMPILE);
This also runs fast, in 1 second, with both TOP 1 * and COUNT(*).
But the code requirements are:
- capture the existence of records in the @changes variable
- do not return any result sets from the SELECT statement (because that interferes with the calling application), so I cannot SELECT TOP 1 and then check @@ROWCOUNT afterwards
But the moment I add the assignment to @changes:
SELECT TOP 1 @changes = 1
-- @changes = COUNT(*)
FROM UserPrefs
RIGHT OUTER JOIN CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
    ON [UserPrefs].[UserPrefSys] = ct.[UserPrefSys]
WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
        OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
  AND ct.SYS_CHANGE_OPERATION IN ('I', 'D')
OPTION (RECOMPILE);
the query becomes slow again and takes 10 seconds.
These change-tracking queries run for every table in the database, so the total time adds up to 10 minutes for the stored procedure.
I have updated statistics, rebuilt indexes, etc., but I don't know whether the internal change-tracking tables benefit from that.
How do I solve this?
September 10, 2016 at 10:42 pm
Never mind, I just thought of a workaround - I can insert a record into a temp table and then check the row count:
-- #tmp is assumed to exist already, e.g. CREATE TABLE #tmp (flag INT);
INSERT INTO #tmp
SELECT TOP 1 1
FROM CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
        OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
  AND ct.SYS_CHANGE_OPERATION IN ('I', 'D')
OPTION (RECOMPILE);

SELECT @changes = @@ROWCOUNT;
September 11, 2016 at 12:48 am
Hints apply to the whole query, so recompile, if you need it, would go at the end of the outer query. You can also consider OPTION(OPTIMIZE FOR ...)
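For illustration, one way to keep the hint on the outermost statement while still assigning to a variable and returning no result set is a sketch along these lines, reusing the OP's table and variables (not the only option):

SELECT @changes = CASE WHEN EXISTS (
           SELECT 1
           FROM UserPrefs
           RIGHT OUTER JOIN CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
               ON [UserPrefs].[UserPrefSys] = ct.[UserPrefSys]
           WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
                   OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
             AND ct.SYS_CHANGE_OPERATION IN ('I', 'D')
       ) THEN 1 ELSE 0 END
OPTION (RECOMPILE);
-- or, in place of RECOMPILE, something along the lines of:
-- OPTION (OPTIMIZE FOR (@last_sync_version = 42171184));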
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 11, 2016 at 7:39 am
I have a STRONG vote for OPTION (RECOMPILE). That way you always get the best chance for the "optimal" plan no matter what your input(s) are.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
September 12, 2016 at 4:02 am
I'd test it using a parameter, not a local variable. Most of the time, but clearly not always, parameter sniffing is a good thing and helps performance. It really depends on the distribution of the data within your statistics for the column you're filtering on. If that data is not fairly evenly distributed, then I agree with the others (of course) that RECOMPILE is the way to go, if you can afford to do that (RECOMPILE is not free).
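To test that, the check could be wrapped in a procedure so the version number arrives as a sniffable parameter rather than a local variable. The procedure name below is made up for illustration, and the join to UserPrefs is omitted as in the OP's later workaround:

CREATE PROCEDURE dbo.CheckUserPrefChanges   -- hypothetical name
    @last_sync_version BIGINT,
    @sys_change_context UNIQUEIDENTIFIER,
    @changes INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    -- As a parameter, @last_sync_version is sniffed at compile time,
    -- so the plan is built for the actual value passed in.
    SELECT @changes = CASE WHEN EXISTS (
               SELECT 1
               FROM CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
               WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
                       OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
                 AND ct.SYS_CHANGE_OPERATION IN ('I', 'D')
           ) THEN 1 ELSE 0 END;
END;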
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
September 12, 2016 at 7:17 am
sqld-_-ba (9/10/2016)
nevermind, just thought of workaround - I can insert record into table and then check rowcount
insert into #tmp select
top 1 1 FROM CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
WHERE (ct.SYS_CHANGE_CONTEXT IS NULL OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context))
AND ct.SYS_CHANGE_OPERATION IN ( 'I', 'D' ) OPTION (RECOMPILE)
select @changes = @@rowcount
Try also
DECLARE @Rows INT
SELECT @Rows = COUNT(*) FROM CHANGETABLE(CHANGES ...
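Filling in the rest from the variables and predicates already posted above, that pattern might look like this (no result set is returned, and the hint sits on the outer SELECT):

DECLARE @Rows INT;

SELECT @Rows = COUNT(*)
FROM CHANGETABLE(CHANGES dbo.UserPrefs, @last_sync_version) AS ct
WHERE ( ct.SYS_CHANGE_CONTEXT IS NULL
        OR ct.SYS_CHANGE_CONTEXT <> CONVERT(VARBINARY(128), @sys_change_context) )
  AND ct.SYS_CHANGE_OPERATION IN ('I', 'D')
OPTION (RECOMPILE);

SELECT @changes = CASE WHEN @Rows > 0 THEN 1 ELSE 0 END;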
For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
September 12, 2016 at 2:36 pm
Another thought (something we hit in our shop): if OPTION (RECOMPILE) improves performance, your statistics may be out of date.
How frequently do you update statistics, or do you just let that happen automagically?
If performance is slow in your development environment, you could try running the query as-is, then update your statistics, run the query again, and see if performance improves.
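For example, a manual statistics refresh on the user table involved in these queries might look like this sketch (FULLSCAN is optional; the default sampling may be enough):

-- Refresh statistics on the tracked user table from this thread
UPDATE STATISTICS dbo.UserPrefs WITH FULLSCAN;

-- Or refresh statistics database-wide
EXEC sp_updatestats;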
YMMV.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there is negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.