March 9, 2011 at 9:40 am
Hi Everyone.
Here are the details.
Server: SQL 2005, Win 2003, 32 GB RAM, Max Server Memory set to 20 GB. My databases and T-Logs are each stored on their own dedicated drives. The OS has pagefiles on the system drive and another drive but not on the drives where I keep DB's and T-Logs.
The procedure is fairly simple: it loads data into a work table, executes a series of updates on it from queries/sub-queries, and then uses that table to update the permanent table.
I used a table variable as my work table, a simple "Declare @Temptable Table" with 25 fields. That procedure has run twice a day for months and never took more than 50 seconds... until last week.
At 12:30 Thursday it took its usual 45 seconds; then at 7:30 Friday morning it took over 90 minutes.
While trying to diagnose the problem, I changed from a table variable to a temp table because I wanted to run some queries on it, but then it started taking its usual 45-50 seconds.
I have tested this numerous times in the last 4 days and it's always the same: the table variable takes 90 minutes, the temp table takes 45 seconds.
Now I'm left trying to understand why. Obviously something happened on my server between 12:30 Thursday and 7:30 Friday, but there's nothing in the logs or in the event viewer that would explain it.
Everything else runs perfectly. I haven't seen anything else running slower than usual, no user has complained of any issue, our RAID controller hasn't logged any errors, and there doesn't seem to be anything wrong with the RAM.
I don't know where to look. I'm a self-taught DBA and don't know how to check or diagnose SQL Server memory usage, but I suspect it's something to do with memory and/or pagefiles.
One last thought: it can't be a data-related issue because I have the same problem with my development database.
I'm in the dark.
March 9, 2011 at 9:51 am
This sounds like it could be a stale statistics problem. Have you run an update stats on your database?
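If you haven't, something along these lines is the usual first step (the table name here is just a placeholder; substitute your own source tables):

-- Refresh statistics on one table with a full scan (most accurate, slowest)
update statistics dbo.YourSourceTable with fullscan

-- Or refresh everything in the current database with the built-in procedure
-- (it uses default sampling, so it's quicker but less thorough)
exec sp_updatestats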
March 9, 2011 at 10:11 am
rjohal above is correct; it most likely has to do with the statistics difference between temp tables and table variables affecting the query plan. Temp tables store more accurate statistics for compilations. This would be my first guess without looking over the execution plans.
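If you want to see the difference for yourself, here's a quick demo (just a sketch; run it with the actual execution plan turned on and compare estimated vs. actual rows on the two selects):

-- Table variable: no statistics, so the optimizer typically estimates 1 row
declare @tv table (id int primary key)
insert into @tv select number from master.dbo.spt_values where type = 'P'
select * from @tv where id > 100

-- Temp table: statistics get created, so the estimate tracks the real row count
create table #tt (id int primary key)
insert into #tt select number from master.dbo.spt_values where type = 'P'
select * from #tt where id > 100
drop table #tt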
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
March 9, 2011 at 10:36 am
First, thank you all for helping, I knew I was asking at the right place 🙂
I'm a self-taught DBA and, to be honest, I really don't know much about statistics; I've never done anything with them until now.
My understanding, however, is that these are reset whenever SQL Server starts up. Am I right? Because I forgot to mention in my post that I rebooted the server and the problem remains.
I also don't quite understand what you mean about stale statistics; according to this TechNet article, table variables don't have statistics. Do you mean that I should update the statistics on the tables that I get my data from?
March 9, 2011 at 10:41 am
Gagne (3/9/2011)
My understanding, however, is that these are reset whenever SQL Server starts up. Am I right?
Nope
I also don't quite understand what you mean about stale statistics; according to this TechNet article, table variables don't have statistics. Do you mean that I should update the statistics on the tables that I get my data from?
Correct, table variables have no statistics, and that could well be part of the problem. If the row count is increasing, you could easily have reached a tipping point. The fact that temp tables work fine implies that may very well be the case.
Update stats (with full scan) on the real tables and consider switching to a temp table
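The switch itself is mostly mechanical, something like this (a sketch only; the column names here are placeholders for the 25 fields you mentioned):

-- Before: Declare @Temptable Table ( ... ) gives the optimizer nothing to work with
-- After: the same columns in a temp table, which does get statistics
create table #Temptable
(
    SomeKey char(5),
    SomeValue numeric(18,5)
    -- ... the remaining columns exactly as in the table variable declaration ...
)

-- Dropped automatically when the procedure ends, or drop it explicitly
drop table #Temptable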
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 9, 2011 at 10:47 am
GilaMonster (3/9/2011)
Update stats (with full scan) on the real tables and consider switching to a temp table
I've already changed the procedure to use temp tables, so that is no longer a concern. What bothers me is that something happened or changed in my server, and I won't be satisfied until I understand what it is, because it could create other problems that I just haven't seen yet.
As for the number of rows, I thought of that, but I couldn't see a large increase in rows in those tables, so I tried it in my development database, which hasn't changed in over a month, and it does the same thing.
I'll try updating the statistics on the tables that are used in this proc to see if that's where the problem lies.
March 9, 2011 at 11:34 am
Updating statistics with FULLSCAN on all the tables from which I get data didn't make a difference.
I ran my code in small pieces to see if I could find where it's slow... it's slow everywhere 🙂
This is the part that prepares my table variable:
Declare @Period char(6)
Declare @Style char(10)

set @Period = ''
set @Style = ''

Declare @Temptable Table
(
    Warehouse char(5),
    Styletype char(5),
    rmtype char(5),
    InvOK char(1),
    Style char(15),
    Label char(5),
    Color char(5),
    Dimension char(5),
    Prepack char(5),
    StyleName char(30),
    StyleDesc char(100),
    Opening numeric(18,5),
    RECV numeric(18,5),
    RTRN numeric(18,5),
    SA numeric(18,5),
    SHIP numeric(18,5),
    RCADJ numeric(18,5),
    TRIN numeric(18,5),
    TROUT numeric(18,5),
    OTHER numeric(18,5),
    Closing numeric(18,5),
    Currency char(3),
    StandardCost money,
    ClosingCost money,
    costpkey int
)

if @Period = ''
    select @Period = dba.dbo.getfiscalperiod(getdate()-1)

Declare @Warehouses Table
(
    Warehouse char(5),
    Style char(15)
)

insert into @Warehouses
select distinct warehouse, style
from invtransactions
where style = @Style or @Style = ''

declare @Date datetime

select @Date = max(Caldate)
from dba.dbo.fiscalcalendar
where period = @Period

insert into @Temptable
select distinct
    w.warehouse, s.StyleType, s.RMType, s.InventoryOKFlag,
    a.Style, a.Label, a.Color, a.Dimension, a.prepack,
    s.stylename, s.styledesc,
    0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, '', 0.0, 0.0,
    dbo.resolveitemeffcostpkey('00', a.style, a.label, a.color, a.dimension, a.prepack, '01', @Date)
from StyleItem a
join Style s on s.style = a.style
join @Warehouses w on w.style = a.style
where s.style = @Style or @Style = ''
That part of the code takes just about the same amount of time whether I use temp tables or table variables. It's about 45 seconds, and I think that's to be expected since I'm forcing table scans with my where clauses and I also call a function in my select statement.
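If it helps anyone reproduce the timings, each piece can be wrapped like this to get per-statement CPU time and I/O in the Messages tab:

set statistics io on
set statistics time on

-- ... run one piece of the procedure here ...

set statistics time off
set statistics io off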
Then I execute 8 update statements like the following; the only difference between them is the field they update and the invtrxtype in the where clause. With the temp table, as shown below, it's almost instantaneous; with a table variable, it used to be the same but now it takes about 12 minutes for each update.
What could cause this statement to jump from less than 1 second to over 10 minutes?
update a set Opening = i.invtrxpair
from @Temptable a
join (select warehouse, style, label, color, dimension, prepack, sum(invtrxpair) invtrxpair
      from invtransactions
      where arcurrentperiod < @Period and ARCurrentPeriod <> ''
      group by warehouse, style, label, color, dimension, prepack) i
    on i.warehouse = a.warehouse and i.style = a.style and i.label = a.label
   and i.color = a.color and i.dimension = a.dimension and i.prepack = a.prepack
March 9, 2011 at 11:43 am
Very likely the lack of stats on the table variable, combined with a change in data distribution in the other tables. How many rows are you putting into that table variable?
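As an aside, if a table variable has to stay anywhere, a recompile hint on the statement lets the optimizer at least see the variable's actual row count at execution time (still no column statistics, just the count). A minimal illustration, not your actual procedure:

declare @t table (id int primary key, val int)
insert into @t select number, number from master.dbo.spt_values where type = 'P'

select count(*)
from @t a
join @t b on b.id = a.id
option (recompile) -- the statement recompiles after @t is populated,
                   -- so its real row count feeds the plan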
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 9, 2011 at 11:57 am
GilaMonster (3/9/2011)
How many rows are you putting into that table variable?
Currently, a little over 10,100. I know it's a lot for a table variable and I should never have used one in the first place, but that's the same number of rows I was dealing with 6 days ago when it was working well.
March 9, 2011 at 12:13 pm
Way, way too much for a table variable. It's not just the row counts; it's also the row counts in the other tables and the distribution of data in them. Additionally, there could have been a cached plan from a point when things were good, and that plan got thrown out.
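If you want to check next time it happens, the plan cache is visible through the DMVs (these exist on 2005):

-- Find cached plans that reference the procedure's work table and
-- see how often each one has been reused
select cp.usecounts, cp.cacheobjtype, cp.objtype, st.text
from sys.dm_exec_cached_plans cp
cross apply sys.dm_exec_sql_text(cp.plan_handle) st
where st.text like '%Temptable%'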
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 9, 2011 at 12:26 pm
Gail, let me put things differently then. Should I be worried about this? I'm really not concerned about the execution time, since I fixed it with temp tables. What really bothers me is how it started taking 120 times longer just like that.
I'm left wondering if there's something wrong somewhere in my server that's just waiting for a reason to blow up on me.
March 9, 2011 at 12:31 pm
Gagne (3/9/2011)
Gail, let me put things differently then. Should I be worried about this?
Other than as a lesson in performance tuning, no
What really bothers me is how it started taking 120 times longer just like that.
It happens. It's called a tipping point. Often you'll get performance that slowly degrades and then suddenly falls off a cliff. It's not unusual
I'm left wondering if there's something wrong somewhere in my server that's just waiting for a reason to blow up on me.
Based just on this, no. This is far from unusual
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 9, 2011 at 12:36 pm
Then I guess all I have to do is go through all my procedures and get rid of those table variables if they're gonna hold more than a handful of records.
Thanks everyone for the help, it's deeply appreciated.