June 13, 2007 at 2:56 am
Hi,
In my procedures I am using temporary tables to store intermediate results.
I am creating temporary tables using the following methods:
1. Select * into #xxx from xxx
2. Create table #xxx
Which is the best method to create a temporary table?
But some articles say to try to use table variables instead of temporary tables.
karthik
June 13, 2007 at 3:18 am
The second option is the better one and works out faster when the data you need to dump is huge. SELECT * INTO is slower than CREATE TABLE. If you do a bit of testing, the execution plan will show the same for small data sets and will vary for huge chunks of data.
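For reference, a rough sketch of the two approaches from the question (the xxx names are placeholders, and the column list is assumed for illustration):

-- Option 1: SELECT * INTO creates #xxx on the fly from the source table's definition
select * into #xxx from dbo.xxx
drop table #xxx

-- Option 2: define the temp table explicitly, then load it with a separate insert
create table #xxx (id int not null, name varchar(100) null)
insert into #xxx (id, name)
select id, name from dbo.xxx
drop table #xxx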
June 14, 2007 at 1:03 am
Sometimes I use a 'select * into', other times a 'create table', but always followed by an 'insert' statement.
The reason is that while the 'select * into' statement is processing, the system tables in tempdb are effectively locked against other updates. That is, other users or processes creating temp tables will have to wait.
The way to use 'select * into' while limiting the amount of time the exclusive lock is held on the tempdb system tables is:
'select * into <temptable> from <table> where 1 = 0'
This effectively creates your table definition just like a 'create table'.
Then use a regular 'insert' statement.
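A rough sketch of that pattern (#work and dbo.SourceTable are made-up names):

-- WHERE 1 = 0 copies the column definitions but returns no rows,
-- so the lock on the tempdb system tables is held only briefly
select * into #work from dbo.SourceTable where 1 = 0

-- the bulk of the work is done by an ordinary insert, which does not hold that lock
insert into #work
select * from dbo.SourceTable

drop table #work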
As mentioned by Sudarsan, only use the 'select * into' for relatively small data sets.
More details are explained in Ken Henderson's book "The Guru's Guide to Transact-SQL" on page 301 under the topic of Optimising Transactional Code.
------
Robert
June 14, 2007 at 1:30 am
A @table variable is much better than a #temp table, and a derived table is much better than a @table variable. And bulk inserts are much better than iterative inserts.
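To illustrate the last point, a rough sketch (dbo.Source and dbo.Target are made-up names) contrasting a set-based insert with a row-by-row loop:

-- set-based: one insert moves all qualifying rows at once
insert into dbo.Target (id, name)
select id, name
from dbo.Source
where status = 'A'

-- iterative: a cursor inserts one row at a time and is usually far slower
declare @id int, @name varchar(100)
declare c cursor for
    select id, name from dbo.Source where status = 'A'
open c
fetch next from c into @id, @name
while @@fetch_status = 0
begin
    insert into dbo.Target (id, name) values (@id, @name)
    fetch next from c into @id, @name
end
close c
deallocate c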
Thanks,
Irfan Baig.
June 15, 2007 at 7:28 am
That's a bit too broad of a generalization, Irfan. Table variables come with significant limitations that make them less performant than temp tables (when they are both used optimally). Also, the engine will have to stop and take the time to drop table variables down to tempdb anyway when they reach a certain size - which can also slow things down.
I definitely agree that it is often best to use a bit more complex logic to combine statements such that you don't need temporary storage in the first place and that bulk inserts are better than looping ones.
Addressing the OP, I have seen SELECT * INTO... usage really bring a SQL 2000 system to its knees due to tempdb system object locking. Be sure to do scalability testing if you go down this road.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 15, 2007 at 12:33 pm
Hi SQL Guru...
Table variables exist in tempdb. There is no such animal as "dropping down to tempdb".
Check this out on a play server....
set nocount on

-- snapshot the objects that exist in tempdb before the table variable is declared
Create Table #T1 (tname sysname)
Create Table #T2 (tname sysname)

Insert #T1
select name
from tempdb.dbo.sysobjects
GO

-- declare a table variable, then snapshot tempdb again
declare @Foo table(fizzbin int)
declare @tname sysname

Insert #T2
select name
from tempdb.dbo.sysobjects

-- the object that appears only in the second snapshot is the table variable itself
Select @tname = tname from #T2 where tname not in (Select tname from #T1)

-- and its column (fizzbin) is visible in tempdb's system tables
Select sc.name as ColumnName from tempdb.dbo.syscolumns sc
inner join tempdb.dbo.sysobjects so on so.id = sc.id
where so.name = @tname

drop table #T1
drop table #T2
June 15, 2007 at 1:26 pm
I thought that table variables, while created as an object in tempdb like a temp table, keep all their data in memory as long as sufficient memory exists, and are only written out to disk when that is no longer the case. Perhaps I misunderstand their internal workings.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 17, 2007 at 2:52 pm
You're right about table variables.
Temp tables behave exactly the same way.
While they are small, they are in memory.
_____________
Code for TallyGenerator
June 17, 2007 at 7:18 pm
For some pretty good info on the differences between Temp Tables and Table Variables, check out the following URL... Q3/A3 and Q4/A4 seem to be the most valuable...
http://support.microsoft.com/default.aspx?scid=kb;en-us;305977&Product=sql2k
--Jeff Moden
Change is inevitable... Change for the better is not.
June 18, 2007 at 11:42 pm
Like rcavill, if I need to do a Select Into, I use 'select * into <temptable> from <table> where 1 = 0' followed by an insert into statement, but for a different reason.
There is an inherent problem with Select Into that could cause it to generate an error. When doing a Select Into, the new table is created based on the data values in the result set being used. For example, with a string column such as a varchar or nvarchar, if the column is varchar(1000) but the maximum length of the data in the column is only 200 characters, the column will be created with a smaller size.
If the varchar column contains only null values, it will try to create the column as a varchar(0) column, causing the query to fail with an error.
I almost always use an explicit Create Table statement. When using 'where 1 = 0', SQL Server will look at the underlying data structures to determine what the columns should be.
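A rough sketch of that check (dbo.SourceTable is a made-up name): build the empty shell with 'where 1 = 0', then look at the column definitions it inherited before loading it.

-- the empty shell inherits the source table's column definitions
select * into #Shell from dbo.SourceTable where 1 = 0

-- inspect the definitions the temp table actually got
select sc.name as ColumnName, t.name as TypeName, sc.length
from tempdb.dbo.syscolumns sc
inner join tempdb.dbo.systypes t on t.xusertype = sc.xusertype
where sc.id = object_id('tempdb..#Shell')

-- then load it with a regular insert
insert into #Shell
select * from dbo.SourceTable

drop table #Shell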
June 19, 2007 at 4:58 am
Robert brought up some very good points and I just wanted to expound on them. There are two keys to good performance... one, of course, is the code folks write... the other is the underlying table schema. Whether it's a temp table, a permanent table, or a "temporary permanent" table, if the schema, data, or indexes are out of whack, performance will suffer.
I'll also add to what you said... a SELECT/INTO also makes your code more difficult to troubleshoot, especially when generating an estimated execution plan for sanity checks... instead of just being able to materialize a table, you also have to populate it. Yep, I know... you get a plan closer to fact if you have populated temp tables, but like I said, for sanity checks on big code, the SELECT/INTOs make life miserable.
--Jeff Moden
Change is inevitable... Change for the better is not.
June 19, 2007 at 9:51 am
I tend to agree with Jeff and try to create the table explicitly, then insert as needed, add indexes, etc.
I've seen so much on table variables vs. temp tables that I think it's inconclusive as to which is better. To me, as a longtime user of code, I would use a temp table if I needed to, but for people who are more on the development side, a table variable seems more natural. They are being enhanced in SQL Server 2008, so I might recommend spending more time practicing with table variables for that reason.