Overactive use of Temp Tables?

  • I have a .NET application I wrote that collects data from machines and updates a database with summary count data every 30 seconds (business requirement to be near realtime).

    To support this, during every 30-second cycle my application creates a temp table, inserts the detail records into it (about five hundred or so), summarizes those into a second temp table, and then updates the permanent table, incrementing all the related records with the values stored in the summary temp table.
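    Roughly, each cycle looks like the sketch below (table and column names are made up for illustration):

    ```sql
    -- Simplified sketch of one 30-second cycle (illustrative names only)
    CREATE TABLE #Detail (MachineID int NOT NULL, EventCount int NOT NULL);

    -- ~500 detail rows are bulk-inserted into #Detail here by the .NET application

    -- Summarize the detail rows into a second temp table
    SELECT MachineID, SUM(EventCount) AS EventCount
    INTO   #Summary
    FROM   #Detail
    GROUP  BY MachineID;

    -- Increment the related rows in the permanent table from the staged totals
    UPDATE p
    SET    p.EventCount = p.EventCount + s.EventCount
    FROM   dbo.MachineSummary AS p
    JOIN   #Summary AS s
      ON   s.MachineID = p.MachineID;

    DROP TABLE #Summary;
    DROP TABLE #Detail;
    ```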

    This has been working and performing fine, but after reviewing my query plan stats and running a trace, I see that this application is not only saturating my default trace file with "Object:Created" events (one for each temp table and six or seven more for each auto-created statistics object) but is also recompiling the statements every time (every 30 seconds!).

    The statements are apparently recompiling because they are using a new temp table every time, which I guess makes sense. (The recompile reason given in the SQL Trace is "temp table changed".)

    I considered using table variables instead of temp tables so that I could avoid the auto-stats and see if that helps my recompiles. But in .NET I can't figure out how to perform a bulk insert into a table variable and still have access to the table in a subsequent command (it goes out of scope and disappears).

    So my next option is to just use permanent tables. But if I'm filling and clearing them every 30 seconds, I'm going to end up with a huge transaction log.

    Any recommendations?

  • What you are seeing with the table variables is expected. Table variables are scoped to the SQL batch.

    Why not eliminate the temp tables completely and insert into your destination table? What are you doing to the data that is so complex that you'd need to stage it first?
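    For example (illustrative names), the second batch below fails because the table variable no longer exists once the first batch ends, whereas a #temp table would survive for the life of the session:

    ```sql
    -- A table variable only lives for the duration of the batch that declares it
    DECLARE @Detail TABLE (MachineID int NOT NULL, EventCount int NOT NULL);
    INSERT INTO @Detail (MachineID, EventCount) VALUES (1, 10);
    SELECT * FROM @Detail;   -- works: same batch
    GO
    SELECT * FROM @Detail;   -- fails: @Detail is out of scope in the new batch
    ```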

    John Rowan

    ======================================================
    ======================================================
    Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden

    The permanent table needs to have several hundred records either updated (if they exist) or inserted. In situations like this I always try to stage the data in a temp table first so that I can perform the mass insert/update using just two simple join queries against the temp table (sketched below), instead of several hundred individual insert/update queries.

    This has most certainly been the best approach for me in the past in a data warehouse where you're doing bulk loads of several million records. I'm starting to doubt that it was the best approach here though.
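    For reference, the two set-based statements I mean look roughly like this (illustrative table and column names):

    ```sql
    -- 1) Update existing rows from the staged summary
    UPDATE p
    SET    p.EventCount = p.EventCount + s.EventCount
    FROM   dbo.MachineSummary AS p
    JOIN   #Summary AS s
      ON   s.MachineID = p.MachineID;

    -- 2) Insert rows that don't exist yet
    INSERT INTO dbo.MachineSummary (MachineID, EventCount)
    SELECT s.MachineID, s.EventCount
    FROM   #Summary AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.MachineSummary AS p
                       WHERE  p.MachineID = s.MachineID);
    ```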

  • If you are going the table variable route, I would create a stored procedure and then call the stored procedure from .Net.

    Another option would be a CTE.
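    As a rough sketch only (the procedure name and how the detail rows reach it are assumptions), a stored procedure keeps the table variable in a single scope, so it never disappears between commands:

    ```sql
    -- Illustrative only: everything happens in one scope inside the procedure
    CREATE PROCEDURE dbo.usp_ApplyMachineCounts
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @Summary TABLE (MachineID int NOT NULL, EventCount int NOT NULL);

        -- however the detail rows arrive (per-row calls, XML, etc.),
        -- they would be inserted into @Summary here

        UPDATE p
        SET    p.EventCount = p.EventCount + s.EventCount
        FROM   dbo.MachineSummary AS p
        JOIN   @Summary AS s
          ON   s.MachineID = p.MachineID;
    END;
    ```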

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • But there is no way to do a bulk insert (from in-memory data) with a stored procedure.

  • Why not use the same tables every time but truncate them before each cycle?
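    For example (illustrative names), with permanent staging tables each cycle might start like this:

    ```sql
    -- Reuse the same permanent staging tables every cycle
    TRUNCATE TABLE dbo.Staging_Detail;     -- TRUNCATE is minimally logged
    TRUNCATE TABLE dbo.Staging_Summary;

    -- bulk insert the ~500 detail rows into dbo.Staging_Detail here, then:
    INSERT INTO dbo.Staging_Summary (MachineID, EventCount)
    SELECT MachineID, SUM(EventCount)
    FROM   dbo.Staging_Detail
    GROUP  BY MachineID;

    -- ...and update the permanent table from dbo.Staging_Summary as before
    ```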

  • Glenn Dorling (3/31/2010)


    Why not use the same tables every time but truncate them before each cycle?

    If you are doing this numerous times a day, this seems to be the most efficient method.

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • Usually, "It depends". This time, "It doesn't matter". 😉

    Even without the temp table and going direct, you'd likely get a recompile every time anyway because of the amount of data that gets put into the final table every run.

    The other thing is, this is a batch job that only runs once every 30 seconds. What makes you think a recompile is a bad thing here?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

    Initially, I would agree that it doesn't seem to matter, since the application seems to be running well and does not appear to be putting undue stress on the server. However, this one application (which represents 1% of all the applications that access this server) accounts for over 70% of the activity in my default trace logs. I really just want to "clean it up" so that it doesn't fill my logs with so much noise.

    Allow me to step outside the box here and ask why you don't 'summarize' the data in the .NET application before it ever hits the database. Then just insert the final stuff you need to insert. Poof! MUCH better performance on SQL Server, less network traffic, and scale-out capabilities as well.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

    I'm actually considering that approach (see my response to John Rowan earlier). The only downside is I have to perform several hundred IF EXISTS ... UPDATE ... ELSE INSERT ... commands. I do suppose, though, that it's the lesser of two evils.
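    For clarity, this is the per-record command I have in mind (the table and parameter names are made up; the values would be supplied by the .NET application):

    ```sql
    -- One round trip per detail record
    IF EXISTS (SELECT 1 FROM dbo.MachineSummary WHERE MachineID = @MachineID)
        UPDATE dbo.MachineSummary
        SET    EventCount = EventCount + @EventCount
        WHERE  MachineID  = @MachineID;
    ELSE
        INSERT INTO dbo.MachineSummary (MachineID, EventCount)
        VALUES (@MachineID, @EventCount);
    ```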

    How does your code populate the temp tables? From the look of your last post, is it a loop or a cursor? Please forgive me if I am wrong.

    Have you considered using XML to perform the data load?
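    A rough sketch of what I mean (the XML shape and names are made up): the application sends a single XML document per cycle and the database shreds it straight into the staging table:

    ```sql
    -- Illustrative only: shred one XML parameter into the detail staging table
    DECLARE @Payload xml;
    SET @Payload = N'
    <rows>
      <r machine="1" count="10" />
      <r machine="2" count="25" />
    </rows>';

    SELECT r.value('@machine', 'int') AS MachineID,
           r.value('@count',   'int') AS EventCount
    INTO   #Detail
    FROM   @Payload.nodes('/rows/r') AS x(r);
    ```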

    ...and your only reply is slàinte mhath

  • bobmclaren


    My next option is to just use permanent tables. But if I'm filling and clearing them every 30 seconds, I'm going to end up with a huge transaction log.

    Load the permanent tables using minimally-logged BULK INSERT (or bcp, or similar). Process the data, and then truncate the permanent tables. TRUNCATE TABLE is always 'minimally logged'.

    Normally, this approach would still result in large log backups (not large log files!) because SQL Server has to back up any data pages affected by minimally-logged operations, in the following log backup, for recovery reasons. SQL Server uses a map called the Bulk-Changed Map (BCM) to keep track of data pages that have been modified by minimally-logged operations since the last log backup.

    However, if you can bulk load, process, and truncate before the log backup is taken, the BCM will likely have been cleared for that object by the truncate, and the data pages affected will not be backed up to the log, as would normally happen.

    I say 'likely' because the de-allocations from TRUNCATE are performed asynchronously, on a background thread, if the table uses 128 or more extents (8MB).
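    A rough outline of the pattern (the file path and table names are made up, and the minimal-logging prerequisites in the links below must be met):

    ```sql
    -- Illustrative outline: minimally-logged load, process, then truncate
    BULK INSERT dbo.Staging_Detail
    FROM 'C:\feeds\detail.dat'
    WITH (TABLOCK);            -- TABLOCK is one of the minimal-logging prerequisites

    -- ...summarize and merge into the permanent table here...

    TRUNCATE TABLE dbo.Staging_Detail;   -- always 'minimally logged'
    ```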

    See:

    Operations That Can Be Minimally Logged

    Prerequisites for Minimal Logging in Bulk Import

    There are plenty of other options here, including SSIS packages to consume and pre-aggregate the data. I mention the above approach because (a) it is technically interesting; and (b) it can be done quickly and completely within the database.

    Paul

  • bobmclaren (4/1/2010)


    I really just want to "clean it up" so that it doesn't fill my logs with so much noise.

    If this is the only reason that you are looking to rework the code, then another option would be to consider turning off the default trace, scripting it, amending the script to suit your requirements (for instance excluding queries issued by your app, increasing the file size/count, etc.), and then creating a startup stored proc to define and start your new trace.

    1. A recent article on SQL Server Central described how to script a server-side trace (including the default trace):

    http://www.sqlservercentral.com/articles/trace/69559/

    2. You can disable the default trace by amending the "default trace enabled" server option using sp_configure.

    3. You can mark a stored procedure in the 'master' database as being a startup procedure:

    http://msdn.microsoft.com/en-us/library/ms181720.aspx
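    A minimal sketch of points 2 and 3 (the procedure name is just an example; the trace definition itself would come from the script in point 1):

    ```sql
    -- 2. Disable the default trace ('default trace enabled' is an advanced option)
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'default trace enabled', 0;
    RECONFIGURE;

    -- 3. Mark a procedure in master as a startup procedure
    USE master;
    GO
    EXEC sp_procoption @ProcName   = N'dbo.usp_StartCustomTrace',  -- hypothetical proc
                       @OptionName = 'startup',
                       @OptionValue = 'on';
    ```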

    Hope this helps,

    Chris

  • Chris Howarth-536003 (4/7/2010)


    If this is the only reason that you are looking to rework the code then I would consider turning off the default trace, scripting it, amending the script to suit your requirements (for instance excluding queries issued by your app, increasing file size/count etc...) then create a startup stored proc to define and start your new trace.

    I see your point, Chris.

    Mind you, even if that is the main reason, I would still go further and look at using permanent tables (by whatever method, maybe SSIS) because tempdb tends to be quite busy enough as it is.
