Temporary Table performance

  • Hi,

    In my procedures I am using temporary tables to store intermediate results.

    I am creating the temporary tables in one of the following two ways:

    1. SELECT * INTO #xxx FROM xxx

    2. CREATE TABLE #xxx

    Which one is the best method to create a temporary table?
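    For reference, the two approaches look like this side by side (the table and column names here are just placeholders):

    ```sql
    -- Method 1: SELECT ... INTO creates the temp table and copies the
    -- data in a single statement; column definitions come from the source.
    SELECT *
    INTO #results
    FROM dbo.SourceTable
    WHERE SomeDate >= '20070101'

    -- Method 2: CREATE TABLE defines the structure explicitly,
    -- then the data is loaded with a separate INSERT.
    CREATE TABLE #results2
    (
        ID       int      NOT NULL,
        SomeDate datetime NOT NULL
    )

    INSERT INTO #results2 (ID, SomeDate)
    SELECT ID, SomeDate
    FROM dbo.SourceTable
    WHERE SomeDate >= '20070101'
    ```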

     

    But some articles say to use table variables instead of temporary tables.

     

     

    karthik

  • The second option is the better one and works out faster when the amount of data you need to dump is large. SELECT * INTO is slower than CREATE TABLE. If you do a bit of testing, the execution plan will be the same for small data sets, but it will vary for large chunks of data.

  • Sometimes I use a 'select * into', other times a 'create table', but always followed by an 'insert into'.

    The reason is that while the 'select * into' statement is processing, the system tables in tempdb are effectively locked against other updates. That is, other users or processes creating temp tables will have to wait.

    The way to use 'select * into' while limiting the time the exclusive lock is held on tempdb's system tables is:

    'select * into <temptable> from <table> where 1 = 0'

    This effectively creates your table definition, just like a 'create table'.

    Then load the data with a regular 'insert into' statement.

    As mentioned by Sudarsan, only use the 'select * into' for relatively small data sets.
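    Putting the above together, the whole pattern looks like this (dbo.BigSourceTable is a made-up name):

    ```sql
    -- Create the structure only: WHERE 1 = 0 returns no rows, so the
    -- exclusive lock on tempdb's system tables is held only for the
    -- instant the table definition is written.
    select * into #work from dbo.BigSourceTable where 1 = 0

    -- Load the data afterwards with a plain insert, which does not
    -- lock tempdb's system tables.
    insert into #work
    select * from dbo.BigSourceTable where SomeColumn = 'some filter'
    ```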

    More details are explained in Ken Henderson's book "The Guru's Guide to Transact-SQL" on page 301 under the topic of Optimising Transactional Code.

    ------

    Robert

  • A @variable table is much better than a #temp table, and a derived table is much better than a @variable table. And bulk inserts are much better than iterative inserts.

     

    Thanks,

    Irfan Baig.

  • That's a bit too broad of a generalization, Irfan. Table variables come with significant limitations that can make them less performant than temp tables (when both are used optimally). Also, the engine will have to stop and take the time to drop table variables down to tempdb anyway when they reach a certain size - which can also slow things down.

    I definitely agree that it is often best to use slightly more complex logic to combine statements so that you don't need temporary storage in the first place, and that bulk inserts are better than looping ones.
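    As a quick illustration of the looping-versus-bulk point (the temp table here is invented for the example):

    ```sql
    create table #numbers (n int)

    -- Iterative: one INSERT per row, one trip through the engine each time
    declare @i int
    set @i = 1
    while @i <= 2000
    begin
        insert into #numbers (n) values (@i)
        set @i = @i + 1
    end

    -- Set-based: a single INSERT ... SELECT does the same work in one pass
    -- (master.dbo.spt_values type 'P' holds the numbers 0 through 2047)
    insert into #numbers (n)
    select number from master.dbo.spt_values
    where type = 'P' and number between 1 and 2000
    ```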

    Addressing the OP, I have seen SELECT * INTO... usage really bring a SQL 2000 system to its knees due to tempdb system object locking. Be sure to do scalability testing if you go down this road.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

  • Hi SQL Guru...

    Table variables exist in tempdb.  There is no such animal as "dropping down to tempdb".

    Check this out on a play server....

    set nocount on

    -- Snapshot the object names currently in tempdb
    Create Table #T1 (tname sysname)
    Create Table #T2 (tname sysname)

    Insert #T1
    select name
    from tempdb.dbo.sysobjects
    GO

    -- Declare a table variable, then snapshot tempdb again
    declare @Foo table (fizzbin int)
    declare @tname sysname

    Insert #T2
    select name
    from tempdb.dbo.sysobjects

    -- The one new object is the table variable itself
    Select @tname = tname from #T2 where tname not in (Select tname from #T1)

    -- Show its column (fizzbin) straight from tempdb's system tables
    Select sc.name as ColumnName from tempdb.dbo.syscolumns sc
    inner join tempdb.dbo.sysobjects so on so.id = sc.id
    where so.name = @tname

    drop table #T1
    drop table #T2

     

     

  • I thought that table variables, while created as an object in tempdb like a temp table, keep all their data in memory as long as sufficient memory exists, and are only written out to disk beyond that. Perhaps I misunderstand their internal workings.

    Best,
    Kevin G. Boles

  • You're right about table variables.

    Temp tables behave exactly the same way.

    While they are small they are in memory.

    _____________
    Code for TallyGenerator

  • For some pretty good info on the differences between Temp Tables and Table Variables, check out the following URL... Q3/A3 and Q4/A4 seem to be the most valuable...

    http://support.microsoft.com/default.aspx?scid=kb;en-us;305977&Product=sql2k

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Like rcavill, if I need to do a Select Into, I use 'select * into <temptable> from <table> where 1 = 0' followed by an Insert Into statement, but for a different reason.

    There is an inherent problem with Select Into that can cause it to generate an error. When doing a Select Into, the new table's column definitions are derived from the select list. A plain column reference keeps the source column's data type, but a computed or derived column (a string function, a literal, a CASE expression, and so on) gets whatever type SQL Server derives for that expression, which can be narrower than you expect. An expression that evaluates to an empty string, for example, produces a varchar(1) column.

    A column created that way can end up too small for the data you insert later, causing the query to fail with a truncation error.

    I almost always use an explicit Create Table statement instead. With the 'where 1 = 0' approach, SQL Server looks at the underlying table's metadata, not the data values, to determine what each column should be. Two more reasons to avoid a populating Select Into:

    1. A Select Into statement doesn't copy over indexes, defaults, constraints, etc., so using a table generated with a Select Into can cause degraded performance.
    2. Using a Select Into statement instead of a Create Table statement makes your code more difficult to follow. It is less obvious what you are doing unless the person reading the code is attentive to every line.
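    A contrived illustration of the type-derivation pitfall (table and column names invented):

    ```sql
    -- The column's type is derived from the expression, not from any
    -- base table: an empty-string literal becomes varchar(1) here.
    select '' as code into #t1

    -- A later insert of a longer value then fails with a truncation
    -- error (under the default ANSI_WARNINGS setting).
    insert into #t1 (code) values ('ABC')
    ```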

     


    My blog: SQL Soldier
    SQL Server Best Practices
    Twitter: @SQLSoldier
    My book: Pro SQL Server 2008 Mirroring
    Microsoft Certified Master: SQL Server, Data Platform MVP
    Database Engineer at BlueMountain Capital Management

  • Robert brought up some very good points and I just wanted to expound on them.  There are two keys to good performance... one, of course, is the code folks write... the other is the underlying table schema and whether it's a temp table or a permanent table or a "temporary permanent" table, if the schema, data, or indexes are out of whack, performance will suffer.

    I'll also add to what you said... a SELECT/INTO also makes your code more difficult to troubleshoot, especially when generating an estimated execution plan for sanity checks... instead of just being able to materialize a table, you have to also populate it. Yep, I know... you get a plan closer to fact if you have populated temp tables, but like I said, for sanity checks on big code, the SELECT/INTOs make life miserable.

    --Jeff Moden



  • I tend to agree with Jeff and try to create the table explicitly, then insert as needed, add indexes, etc.

    I've seen so much on table variables vs. temp tables that I think it's inconclusive as far as which is better. As a longtime database person, I would use a temp table if I needed to, but for people who are more on the development side, a table variable seems more natural. Table variables are being enhanced in SQL Server 2008, so I might recommend spending more time practicing with them for that reason.
