July 7, 2009 at 9:20 pm
Hi experts,
Over the past two days, my tempdb log file has grown to nearly the full free space on its drive at least twice a day.
I moved tempdb to another drive, and this morning I found that the templog had again filled the new drive. I was not able to shrink it, and some users complained they could not connect to the server because tempdb was full.
So I had to restart the server.
What should I do to find the root cause of this issue?
Which events and data columns should I include when I run my trace for the issue?
Why was I not able to shrink the templog using
"DBCC SHRINKFILE('templog', 100)"?
What should I do to avoid a repeat of this issue?
Is restricting the growth of the tempdb files a solution?
Please help.
Tanx 😀
July 7, 2009 at 10:14 pm
Eswin (7/7/2009)
What should I do to find the root cause of the issue? Which events and data columns should I include when I run my trace for the issue?
Hi,
One likely root cause is heavy use of hash (temp) tables.
First, find the hash tables (temp tables) in your procedures/reports and replace them with declared table variables.
ARUN SAS
July 7, 2009 at 10:33 pm
Thanks for the reply
arun.sas (7/7/2009)
replace them with declared table variables in the procedures/reports. ARUN SAS
Can you give more details?
Tanx 😀
July 7, 2009 at 10:48 pm
A temporary solution to your problem is to clear tempdb by restarting SQL Server; tempdb is recreated at startup. After that, change any temporary table you use in a cursor or loop: instead of a temporary table, use a table variable declaration.
July 7, 2009 at 10:53 pm
arun.sas (7/7/2009)
Eswin (7/7/2009)
What should I do to find the root cause of the issue? Which events and data columns should I include when I run my trace for the issue?
Hi,
One likely root cause is heavy use of hash (temp) tables.
First, find the hash tables (temp tables) in your procedures/reports and replace them with declared table variables.
ARUN SAS
Hi,
/* Sample */
CREATE TABLE #MyTable (
    ID int,
    Quantity int)
INSERT INTO #MyTable
...
...
SELECT * FROM #MyTable
This shows a million records; that data is now in tempdb.
Whereas we may use a declared table variable instead:
DECLARE @MyTable table (
    ID int,
    Quantity int)
INSERT INTO @MyTable
...
...
SELECT * FROM @MyTable
This shows a million records; now the data is in the running database.
Some or many of your procedures will have these hash tables. You could find them and replace them with declared table variables to reduce the load on tempdb.
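As a rough, hedged sketch of how one might locate those procedures (assuming SQL Server 2000's sysobjects/syscomments system tables; the LIKE pattern is a crude match and can return false positives):

-- List stored procedures whose source text mentions a # temp table
SELECT DISTINCT o.name AS procedure_name
FROM sysobjects o
JOIN syscomments c ON c.id = o.id
WHERE o.xtype = 'P'        -- stored procedures only
  AND c.text LIKE '%#%'    -- crude match on temp-table usage
ORDER BY o.name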
ARUN SAS
July 7, 2009 at 11:18 pm
I think there are many limitations to using the DECLARE @variable TABLE syntax in SQL Server 2000, and since mine is a 2000 server, many people say it is not advisable to use table variables.
Is there any other way?
What about restricting the growth of the tempdb files?
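For reference, a minimal sketch of what capping the files might look like (assuming the default logical file names tempdev and templog, which you can confirm with EXEC tempdb..sp_helpfile; the sizes are placeholders, and when the cap is reached, queries needing more tempdb space will fail instead):

-- Cap how large the tempdb data and log files may grow
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, MAXSIZE = 2000MB)
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, MAXSIZE = 1000MB)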
Tanx 😀
July 8, 2009 at 1:50 am
Help please...........
Tanx 😀
July 8, 2009 at 8:53 pm
Help please...........
Tanx 😀
July 10, 2009 at 8:20 pm
Try running a Profiler trace with a filter on the tempdb database and see what is consuming space in it.
Don't forget to include events such as:
Under the Database event class:
Data File Auto Grow
Log File Auto Grow
Also include the SQL:BatchCompleted, SP:Starting, and SQL:StmtStarting events; a server-side trace sketch follows below.
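If you prefer to script this rather than use the Profiler GUI, here is a minimal, hedged sketch of an equivalent server-side trace (the output path and max file size are placeholders; the event IDs used are 92 = Data File Auto Grow, 93 = Log File Auto Grow, 12 = SQL:BatchCompleted, and column 3 is DatabaseID):

-- Create a rollover trace file; the directory must already exist (path is a placeholder)
DECLARE @TraceID int, @maxsize bigint, @tempdbid int
SET @maxsize = 50                       -- MB per rollover file
SET @tempdbid = DB_ID('tempdb')
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\tempdb_growth', @maxsize

-- Capture TextData (1), DatabaseID (3), SPID (12), StartTime (14) for the auto-grow events
EXEC sp_trace_setevent @TraceID, 92, 1, 1
EXEC sp_trace_setevent @TraceID, 92, 3, 1
EXEC sp_trace_setevent @TraceID, 92, 12, 1
EXEC sp_trace_setevent @TraceID, 92, 14, 1
EXEC sp_trace_setevent @TraceID, 93, 1, 1
EXEC sp_trace_setevent @TraceID, 93, 3, 1
EXEC sp_trace_setevent @TraceID, 93, 12, 1
EXEC sp_trace_setevent @TraceID, 93, 14, 1

-- Capture TextData (1), LoginName (11), Duration (13) for SQL:BatchCompleted (12)
EXEC sp_trace_setevent @TraceID, 12, 1, 1
EXEC sp_trace_setevent @TraceID, 12, 11, 1
EXEC sp_trace_setevent @TraceID, 12, 13, 1

-- Keep only events whose DatabaseID (column 3) is tempdb, per the filter suggested above
EXEC sp_trace_setfilter @TraceID, 3, 0, 0, @tempdbid

EXEC sp_trace_setstatus @TraceID, 1     -- start the trace
SELECT @TraceID AS TraceID              -- keep this ID to stop/close the trace later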
MJ