March 19, 2012 at 1:53 pm
Thank you all for your help during this upgrade, yikes!!!
I'm getting yet another query that is stalling out.
This query has ALWAYS worked. Last week we fixed our statistics and it ran. Now I'm just watching the elapsed time.
I don't know of anything that has changed, any ideas what I can/should do?
Select Count(*)
From dbo.tbl_TimeCard_Upload_PR
Left Join OpenQuery(Alpha, '
    Select
        JCTMST.CCONO As Job_Co
        , JCTMST.CDVNO As Job_Div
        , JCTMST.CJBNO As Main_Job_No
        , JCTMST.CSJNO As Sub_Job_No
        , JCTMST.CJCDI As JCT_JCD
        , LTrim(RTrim(JCTMST.CJCS1)) As JCD_Seg1
        , LTrim(RTrim(JCTMST.CJCS2)) As JCD_Seg2
        , LTrim(RTrim(JCTMST.CJCS3)) As JCD_Seg3
    From JCTMST
    Where JCTMST.CCSTY = ''L''
    ') As Job_Mst
    On tbl_TimeCard_Upload_PR.Job_Co = Job_Mst.Job_Co
    And tbl_TimeCard_Upload_PR.Job_Div = Job_Mst.Job_Div
    And tbl_TimeCard_Upload_PR.Main_Job_No = Job_Mst.Main_Job_No
    And IsNull(tbl_TimeCard_Upload_PR.Sub_Job_No, '') = LTrim(RTrim(IsNull(Job_Mst.Sub_Job_No, '')))
    And LTrim(RTrim(IsNull(tbl_TimeCard_Upload_PR.JCD_Seg1, ''))) = LTrim(RTrim(IsNull(Job_Mst.JCD_Seg1, '')))
    And LTrim(RTrim(IsNull(tbl_TimeCard_Upload_PR.JCD_Seg2, ''))) = LTrim(RTrim(IsNull(Job_Mst.JCD_Seg2, '')))
    And LTrim(RTrim(IsNull(tbl_TimeCard_Upload_PR.JCD_Seg3, ''))) = LTrim(RTrim(IsNull(Job_Mst.JCD_Seg3, '')))
Where tbl_TimeCard_Upload_PR.WeekEnding = '3/15/2012'
    And tbl_TimeCard_Upload_PR.PR_Status <> 'U'
    And Job_Mst.JCT_JCD Is Null
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'CU'
    /* And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'EA' -- Pay Type to show in list */
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'HL'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'NC'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'OC'
    /* And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'OV' -- Pay Type to show in list */
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'PD'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'PP'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'PS'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'PT'
    /* And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'RT' -- Pay Type to show in list */
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'SO'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'SR'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'SU'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'TO'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'TR'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'UP'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'VA'
    And tbl_TimeCard_Upload_PR.Pay_Type_ID <> 'VU';
March 19, 2012 at 1:56 pm
Does updating stats again fix it?
If so, probably this: http://sqlinthewild.co.za/index.php/2011/03/22/statistics-row-estimations-and-the-ascending-date-column/
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 20, 2012 at 6:06 am
How many Pay_Type_ID values are left? It might make more sense to list the remaining values as = comparisons with a few OR clauses rather than that really long chain of <> comparisons. Also, what about using a derived table to select those values and doing an outer join against it to eliminate them?
However, if you're not regularly updating statistics, that might be the issue.
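A sketch of the derived-table idea, assuming SQL 2005 syntax (hence UNION ALL rather than a VALUES row constructor) and using the pay types excluded in the original query:

```sql
-- Sketch: put the excluded pay types in a derived table and
-- anti-join against it, instead of a long chain of <> comparisons.
Select Count(*)
From dbo.tbl_TimeCard_Upload_PR As t
Left Join (
    Select 'CU' As Pay_Type_ID Union All
    Select 'HL' Union All Select 'NC' Union All
    Select 'OC' Union All Select 'PD' Union All
    Select 'PP' Union All Select 'PS' Union All
    Select 'PT' Union All Select 'SO' Union All
    Select 'SR' Union All Select 'SU' Union All
    Select 'TO' Union All Select 'TR' Union All
    Select 'UP' Union All Select 'VA' Union All
    Select 'VU'
) As excluded
    On t.Pay_Type_ID = excluded.Pay_Type_ID
Where t.WeekEnding = '3/15/2012'
    And t.PR_Status <> 'U'
    And excluded.Pay_Type_ID Is Null;  -- keep only rows whose pay type is NOT in the list
```

The LEFT JOIN ... IS NULL anti-join drops any row whose pay type appears in the derived table, which is equivalent to the original chain of <> filters.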
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 20, 2012 at 7:23 am
Yes Gail...
Updating the stats worked and the query ran in just a few seconds. Sigh...
So my next question: should I / can I modify the Stored Procedure to update statistics prior to the query running?
I could use UPDATE STATISTICS
or execute sp_updatestats
Thoughts?
March 20, 2012 at 7:32 am
No, that would probably not be a particularly good thing to do. I'd recommend creating a job that updates stats on that table with fullscan and scheduling it every night.
See the blog post I referenced for cause.
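For reference, the job step Gail describes boils down to a single statement (table name taken from the query at the top of the thread):

```sql
-- Nightly job step: rebuild every statistics object on the table,
-- reading all rows rather than a sample.
Update Statistics dbo.tbl_TimeCard_Upload_PR With FullScan;
```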
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 20, 2012 at 7:38 am
Oh I read your blog before replying. Nice job by the way.
Since we just upgraded, I'm not sure how often (or if) the admin is running Update Statistics. Regardless, I'll ask him to schedule it nightly, which it appears will solve the problem.
I was just wondering if I could run it after inserting a large batch of new records. This actually happens weekly.
Again you have my deep thanks,
March 20, 2012 at 7:40 am
Ken at work (3/20/2012)
I was just wondering if I could run it after inserting a large batch of new records. This actually happens weekly.
Oh, if you're only adding data once a week, then just put an UPDATE STATISTICS for that table in whatever job/process is doing the insert, just after the insert completes, and don't do any other stats update on that table.
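In the weekly import job that might look like this (a sketch; the insert itself is whatever the existing process already does):

```sql
-- ... the weekly bulk insert into dbo.tbl_TimeCard_Upload_PR runs here ...

-- Refresh the stats immediately afterwards, so the newly inserted
-- rows are reflected in the histograms before anything queries them.
Update Statistics dbo.tbl_TimeCard_Upload_PR With FullScan;
```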
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 20, 2012 at 7:48 am
Grant,
There are quite a few pay types. However I've already started looking to reduce the Where clause.
I was also looking at a temp table for the OpenQuery...
Thank you,
March 21, 2012 at 1:44 pm
Ken,
You're on the right path if you select from the OPENQUERY into a temp table and use that from there on out. I found out long ago that doing a join to a linked server was not a good idea.
If the number of pay codes you want to include is less than the ones you want to exclude, then you could use an IN type construct:
WHERE PayCode IN ('AB', 'AC'...)
Likewise if the number of pay codes to exclude is less you could use:
WHERE PayCode NOT IN ('AB', 'AC'...)
I think it at least would be easier to read.
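Putting both suggestions together might look something like this (a sketch only; the column aliases follow the OPENQUERY in the first post, and the NOT IN list is the set of pay types excluded there):

```sql
-- Pull the linked-server rows across once, into a local temp table.
Select
    CCONO As Job_Co, CDVNO As Job_Div, CJBNO As Main_Job_No,
    CSJNO As Sub_Job_No, CJCDI As JCT_JCD,
    LTrim(RTrim(CJCS1)) As JCD_Seg1,
    LTrim(RTrim(CJCS2)) As JCD_Seg2,
    LTrim(RTrim(CJCS3)) As JCD_Seg3
Into #Job_Mst
From OpenQuery(Alpha, 'Select * From JCTMST Where CCSTY = ''L''');

-- Then join locally, and replace the long <> chain with NOT IN.
Select Count(*)
From dbo.tbl_TimeCard_Upload_PR As t
Left Join #Job_Mst As j
    On t.Job_Co = j.Job_Co
    And t.Job_Div = j.Job_Div
    And t.Main_Job_No = j.Main_Job_No
    -- ... remaining join conditions (trimmed/IsNull'd) as in the original query
Where j.JCT_JCD Is Null
    And t.WeekEnding = '3/15/2012'
    And t.PR_Status <> 'U'
    And t.Pay_Type_ID Not In ('CU','HL','NC','OC','PD','PP','PS','PT',
                              'SO','SR','SU','TO','TR','UP','VA','VU');
```

Materializing the remote rows first means the linked-server call runs once, and the local optimizer can then use normal statistics on both sides of the join.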
Todd Fifield
March 27, 2012 at 8:43 am
The ongoing saga of upgrading to SQL2005, sigh...
Now the DBA is updating statistics nightly. On the table (tbl_PR) that receives a bulk import, the Stored Procedure runs an UPDATE STATISTICS on that table after the import. Life is good.
The table (tbl_FM) that the bulk-import records originate from has records inserted throughout the week, with one or two days being the heaviest. Even though the statistics are updated nightly, partway through the day this table becomes unusable (queries take forever), especially when joined with other tables. Life hurts!
Can you update statistics on a table while someone may be inserting anywhere from one to several hundred records?
I find it difficult to accept that this table is unusable until the next day; it works great in the morning and stops working in the late afternoon as more and more records are inserted.
Ideas?
March 27, 2012 at 8:49 am
I've heard of cases where stats get updated hourly.
p.s. Make sure your DBA is updating stats with fullscan at night, not running something like sp_updatestats.
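To see how stale the statistics on that table actually get during the day, STATS_DATE can help (tbl_FM is the table name from the earlier post):

```sql
-- When was each statistics object on tbl_FM last updated?
Select s.name,
       Stats_Date(s.object_id, s.stats_id) As last_updated
From sys.stats As s
Where s.object_id = Object_Id('dbo.tbl_FM');
```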
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 27, 2012 at 8:56 am
Thanks,
I may have to try that.