June 17, 2011 at 4:01 am
You've got an I/O issue, by the sounds of it. What type of storage configuration are you using? As Gail stated, you'll need to spread across physical spindles (which are not necessarily separate logical drives) to get a win here.
Also, it depends on the code being written by your developers.
Finally, I'd recommend you get your patch level up to SP4.
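As a rough sketch of what "spreading across spindles" can look like in practice, tempdb can be given additional data files on separate physical drives. The file path, name, and sizes below are placeholders, not a recommendation for any specific layout:

```sql
-- Sketch only: add a second tempdb data file on a separate physical drive.
-- Path, name, and sizes are placeholders; adjust for your own storage.
ALTER DATABASE tempdb
ADD FILE
(
    NAME = tempdev2,
    FILENAME = 'E:\SQLData\tempdb2.ndf',  -- ideally a different spindle set
    SIZE = 1024MB,
    FILEGROWTH = 256MB
);
```

Pre-sizing the file (rather than relying on autogrowth) also avoids growth events during busy periods.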
Carlton.
June 17, 2011 at 4:04 am
sqlnes (6/17/2011)
(I think if the file is too big it will affect performance).
Nope.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
June 17, 2011 at 4:35 am
sqlnes (6/16/2011)
We have heavy IO on tempdb right now and the size of tempdb grows rapidly. ... Our programmers use temp tables a lot.
Hey, be careful about this correlation.
Temp tables aren't the only things that cause heavy IO on tempdb. Table variables, table re-indexing, etc. can all cause IO contention on tempdb.
Tempdb does not just exist for temporary table storage. It does a heck of a lot of things for SQL Server. So before you go telling your Devs not to use temporary tables (or stop using them yourself because of this assumption), consider running PerfMon and a server side trace to see what else could be causing the issue.
And consider the fact that it could also be a disk controller issue if your tempdb is on a different physical drive than your other DBs.
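One quick way to sanity-check the "it's the disk" theory before changing any code is to look at the cumulative I/O stall figures SQL Server keeps per file. A sketch, using the standard file-stats DMV:

```sql
-- Sketch: check cumulative I/O and stall times against the tempdb files
-- before blaming temp tables. High io_stall_*_ms relative to the number
-- of reads/writes points at the storage, not the code.
SELECT  DB_NAME(vfs.database_id) AS database_name,
        mf.physical_name,
        vfs.num_of_reads,
        vfs.num_of_writes,
        vfs.io_stall_read_ms,
        vfs.io_stall_write_ms
FROM    sys.dm_io_virtual_file_stats(DB_ID('tempdb'), NULL) AS vfs
JOIN    sys.master_files AS mf
          ON  mf.database_id = vfs.database_id
          AND mf.file_id     = vfs.file_id;
```

Note the numbers are cumulative since the last restart, so sample twice and compare the deltas during a busy window.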
June 17, 2011 at 5:33 am
Jayanth_Kurup (6/17/2011)
Try using CTEs.
Oh, be careful, now. That can actually make the problem worse, especially if the data in the result set of the CTE is needed more than once. A second reference to the CTE would be necessary from the same query, and that means that, just like a view, the CTE would be executed a second time, making the IO contention even worse. CTEs can also use TempDB as much as or more than Temp Tables when they make "work tables" behind the scenes.
Making any changes to the code without knowing exactly what the problem is isn't something that I'd recommend. I will agree, however, that poorly written code is usually (there are exceptions, of course) the reason for severe contention for resources.
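To illustrate the point about a CTE being re-executed on each reference, here's a minimal sketch (the `dbo.Orders` table and its columns are hypothetical):

```sql
-- Sketch: a CTE referenced twice is expanded (and typically executed)
-- twice, like a view -- it is not materialized once and reused.
WITH Totals AS
(
    SELECT CustomerID, SUM(Amount) AS Total   -- hypothetical table/columns
    FROM   dbo.Orders
    GROUP  BY CustomerID
)
SELECT a.CustomerID, a.Total
FROM   Totals AS a
JOIN   Totals AS b                 -- second reference = second execution
  ON   b.Total > a.Total;

-- If the result is needed more than once, materializing it yourself
-- does the aggregation only once:
-- SELECT CustomerID, SUM(Amount) AS Total
-- INTO   #Totals
-- FROM   dbo.Orders
-- GROUP  BY CustomerID;
```

Whether the optimizer actually runs the CTE twice depends on the plan, which is exactly why measuring before rewriting matters.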
--Jeff Moden
Change is inevitable... Change for the better is not.
June 17, 2011 at 5:47 am
Brandie Tarvin (6/16/2011)
However, a lot of people I know use staging tables that they create and drop at need in a user db. These are permanent tables, even though they are only temporarily kept in the db. I don't know if that helps you or not.
You do have to be careful with that. Depending on the disk configuration, you may simply be transferring IO contention problems from TempDB to the local DB. Further, creating such tables may make for another type of contention: Unless you use dynamic SQL to name the tables with a suffix (for example, the SPID number), you've just made it impossible for the same sproc to successfully run more than once simultaneously. Depending on people's "temp" table naming conventions from sproc to sproc, you may have made it so multiple disparate sprocs may not be able to successfully run concurrently. 😉
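A quick sketch of the SPID-suffix workaround mentioned above (the `dbo.Staging_` name and columns are hypothetical):

```sql
-- Sketch: a fixed staging-table name means two concurrent runs collide,
-- so one workaround is to suffix the name with @@SPID via dynamic SQL.
DECLARE @sql nvarchar(max);

SET @sql = N'CREATE TABLE dbo.Staging_' + CAST(@@SPID AS nvarchar(10))
         + N' (CustomerID int NOT NULL, Amount money NULL);';
EXEC (@sql);

-- ...load and use the table here, via the same dynamic name...

SET @sql = N'DROP TABLE dbo.Staging_' + CAST(@@SPID AS nvarchar(10)) + N';';
EXEC (@sql);
```

Of course, at that point a plain #temp table gives you per-session isolation for free, which is rather the original poster's problem to begin with.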
--Jeff Moden
Change is inevitable... Change for the better is not.
June 17, 2011 at 5:51 am
Carlton Leach (6/17/2011)
Also, it depends on the code being written by your developers.
Exactly. 🙂 "Accidental Cross-Joins" (think many-to-many joins frequently overcome by the use of DISTINCT or GROUP BY) due to poor DB design, a misunderstanding of the data, or a simple lack of the proper join criteria are a leading cause of TempDB contention (to name just one of many possible reasons).
--Jeff Moden
Change is inevitable... Change for the better is not.
June 17, 2011 at 7:46 am
Jeff Moden (6/17/2011)
Brandie Tarvin (6/16/2011)
However, a lot of people I know use staging tables that they create and drop at need in a user db. These are permanent tables, even though they are only temporarily kept in the db. I don't know if that helps you or not.
You do have to be careful with that. Depending on the disk configuration, you may simply be transferring IO contention problems from TempDB to the local DB.
I do agree. That's why I didn't say I recommended it. Just that some people I know use that technique rather than using temporary tables.
No one has ever been able to give me a good reason why they do it, either.
June 17, 2011 at 7:51 am
Jeff Moden (6/17/2011)
Brandie Tarvin (6/16/2011)
However, a lot of people I know use staging tables that they create and drop at need in a user db. These are permanent tables, even though they are only temporarily kept in the db. I don't know if that helps you or not.
You do have to be careful with that. Depending on the disk configuration, you may simply be transferring IO contention problems from TempDB to the local DB. Further, creating such tables may make for another type of contention: Unless you use dynamic SQL to name the tables with a suffix (for example, the SPID number), you've just made it impossible for the same sproc to successfully run more than once simultaneously. Depending on people's "temp" table naming conventions from sproc to sproc, you may have made it so multiple disparate sprocs may not be able to successfully run concurrently. 😉
Or, even worse, they can run concurrently, but not on the data they think they are running on. Proc1 ends up running on Proc2's data, and so on.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
June 17, 2011 at 9:09 am
Hi, try looking up table variables. As I understand it, these work from memory; however, I do think there is a limit to the amount of data that can be populated into a table variable before it falls over.
The good thing about table variables is that as soon as the batch completes, the table no longer exists.
For example, run this:
declare @TableVar table (CustomerID int identity, names varchar(20))
insert into @TableVar select 'test1'
Then run this in a separate batch:
select * from @TableVar
and you will get:
Server: Msg 137, Level 15, State 2, Line 1
Must declare the variable '@TableVar'.
Run all three statements together:
declare @TableVar table (CustomerID int identity, names varchar(20))
insert into @TableVar select 'test1'
select * from @TableVar
Result:
CustomerID  names
1           test1
***The first step is always the hardest *******
June 17, 2011 at 10:33 am
glen.wass (6/17/2011)
Hi, try looking up table variables. As I understand it, these work from memory; however, I do think there is a limit to the amount of data that can be populated into a table variable before it falls over.
No limit that I've heard of. Well, aside from the available memory and disk space on your server, that is.
June 17, 2011 at 11:13 am
glen.wass (6/17/2011)
Hi, try looking up table variables. As I understand it, these work from memory
Nope. Same rules as temp tables as to whether they are in memory or spilt to disk.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
June 17, 2011 at 4:48 pm
glen.wass (6/17/2011)
Hi, try looking up table variables. As I understand it, these work from memory...
It's a myth... Table Variables aren't "memory only" (which is what I believe you were implying... apologies if not). Both Table Variables and Temp Tables start out in memory and switch to TempDB disk space when there's not enough room.
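You can see this for yourself with the session space-usage DMV. A sketch (the column sizes are only chosen to force enough data to need real pages):

```sql
-- Sketch: both #temp tables and @table variables allocate pages in tempdb.
-- Fill a table variable with enough rows to need real pages, then look at
-- this session's tempdb allocations.
DECLARE @t table (n int, filler char(2000));

INSERT INTO @t (n, filler)
SELECT TOP (10000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)), 'x'
FROM   sys.all_objects a
CROSS JOIN sys.all_objects b;

SELECT user_objects_alloc_page_count
FROM   sys.dm_db_session_space_usage
WHERE  session_id = @@SPID;   -- non-zero: the variable used tempdb pages
```

A non-trivial `user_objects_alloc_page_count` for the session shows the "memory only" claim doesn't hold once the data has any real size.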
--Jeff Moden
Change is inevitable... Change for the better is not.