Inserting records in a table fills up TempDB

  • Hi,

    I am executing an INSERT statement into a table in a user database, and this process is filling up tempdb. The source of the data is another table within the same database.

    The tempdb data file grows until it completely fills the drive (about 30 GB), and the statement then returns the following error:

    Msg 1105, Level 17, State 2, Line 26

    Could not allocate space for object 'dbo.SORT temporary run storage: 141160416542720' in database 'tempdb' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

    The number of records in the tables is about 150,000.

    System info:

    SQL 2008 R2

    INSERT statement:

    INSERT INTO dbo.TABLE_NAME (MESSAGE_ID, ADDOPER, ADDDATE, MASTER_ID_COLUMN, SUB_ID_COLUMN)
    SELECT DISTINCT
           M.MESSAGE_ID
         , M.ADDOPER
         , M.ADDDATE
         , CC.MASTER_ID_COLUMN
         , CC.SUB_ID_COLUMN
    FROM TABLE1 AS CC
         INNER JOIN TABLE2 AS M ON CC.FORMATTED_PHONE_ADDRESS = M.EMAIL_ADDRESS

    Table DDL:

    CREATE TABLE dbo.TABLE_NAME
    (
        MESSAGE_ID        NUMERIC(12, 0) NOT NULL
      , MASTER_ID_COLUMN  VARCHAR(10)    NOT NULL
      , SUB_ID_COLUMN     INTEGER        NOT NULL
      , ADDOPER           VARCHAR(10)    NOT NULL
      , ADDDATE           DATETIME       NOT NULL
      , MODOPER           VARCHAR(10)    NULL
      , MODDATE           DATETIME       NULL
      , CONSTRAINT PK_TABLE_NAME PRIMARY KEY CLUSTERED
        (
            MESSAGE_ID ASC
          , MASTER_ID_COLUMN ASC
          , SUB_ID_COLUMN ASC
        )
    ) ON [PRIMARY]

    There are no other users on the database during this insert process.

    Any help would be appreciated.

  • What does the following query return?

    SELECT COUNT(*)
    FROM TABLE1 AS CC
         INNER JOIN TABLE2 AS M ON CC.FORMATTED_PHONE_ADDRESS = M.EMAIL_ADDRESS

    “Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

    For fast, accurate and documented assistance in answering your questions, please read this article.
    Understanding and using APPLY, (I) and (II) Paul White
    Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

  • I bet that the DISTINCT sort fills the tempdb, not the INSERT.
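
    If you want to confirm that before changing anything, a check along these lines is one option (a sketch of my own, not part of the original query): run it from another connection while the INSERT is executing and see whether your SPID is the one allocating internal-object pages.

    -- Sketch: tempdb allocation per session; internal objects cover sorts and hash tables
    SELECT   session_id
           , internal_objects_alloc_page_count   -- sort / hash spill pages
           , user_objects_alloc_page_count       -- temp tables and table variables
    FROM sys.dm_db_session_space_usage
    WHERE session_id > 50
    ORDER BY internal_objects_alloc_page_count DESC;

    A large internal-object count against the session running the INSERT points at the sort for the DISTINCT rather than the insert itself.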

    -- Gianluca Sartori

  • Could not allocate space for object 'dbo.SORT temporary run storage: 141160416542720' in database 'tempdb' because the 'PRIMARY' filegroup is full.

    This message says it all.

    -- Gianluca Sartori

    What can be done in situations like these, where a few million records will be inserted into the table? I only want one record inserted per matching record.

    Thanks,

    Vishal

    So it sounds like you don't have enough RAM, so the sort has to spill to tempdb, and you don't have enough disk space, so the write to tempdb fails.

    You say you have a few million records to go through, but really only about 150K to insert, so I guess you have a lot of duplication. Without knowing more about what you have going on, it's hard to give concrete suggestions. But with what little I know of what you're trying to do, I would make sure I had a good ordered index on the fields you need to group by, and I would do the work in batches rather than all at once. You would need a temporary table (or something equivalent) to keep track of which sections you had already processed, so you can move on to the next TOP n DISTINCT rows after the last ones you did.
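
    Something along these lines is what I mean (a rough sketch only, reusing the table and column names from your post; the batch size is arbitrary, and instead of a separate tracking table it leans on the target's clustered primary key to skip rows that are already in):

    -- Sketch: insert the matching rows in batches instead of one big statement
    DECLARE @BatchSize int = 50000,
            @Rows      int = 1;

    WHILE @Rows > 0
    BEGIN
        INSERT INTO dbo.TABLE_NAME (MESSAGE_ID, ADDOPER, ADDDATE, MASTER_ID_COLUMN, SUB_ID_COLUMN)
        SELECT DISTINCT TOP (@BatchSize)
               M.MESSAGE_ID, M.ADDOPER, M.ADDDATE, CC.MASTER_ID_COLUMN, CC.SUB_ID_COLUMN
        FROM TABLE1 AS CC
             INNER JOIN TABLE2 AS M ON CC.FORMATTED_PHONE_ADDRESS = M.EMAIL_ADDRESS
        WHERE NOT EXISTS (SELECT 1
                          FROM dbo.TABLE_NAME AS T
                          WHERE T.MESSAGE_ID       = M.MESSAGE_ID
                            AND T.MASTER_ID_COLUMN = CC.MASTER_ID_COLUMN
                            AND T.SUB_ID_COLUMN    = CC.SUB_ID_COLUMN)
        ORDER BY M.MESSAGE_ID, CC.MASTER_ID_COLUMN, CC.SUB_ID_COLUMN;

        SET @Rows = @@ROWCOUNT;
    END;

    Whether each pass actually spills less to tempdb depends on the plan it gets, so check the execution plan of a single batch before committing to this approach.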

    Good luck.

    └> bt



    Forum Etiquette: How to post data/code on a forum to get the best help

    Thanks for the input. That is what I am thinking of doing - running the insert in batches.

  • Vishal Sinha (10/27/2011)


    What can be done in situations like these, where a few million records will be inserted into the table? I only want one record inserted per matching record.

    Thanks,

    Vishal

    Spend some time investigating the SELECT to find out if the number of output records can be reduced. The join looks a bit odd.
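
    Something like this would show whether the join multiplies rows (a sketch of my own, using the column names from the post):

    -- How many join values appear more than once on each side?
    SELECT FORMATTED_PHONE_ADDRESS, COUNT(*) AS cnt
    FROM TABLE1
    GROUP BY FORMATTED_PHONE_ADDRESS
    HAVING COUNT(*) > 1
    ORDER BY cnt DESC;

    SELECT EMAIL_ADDRESS, COUNT(*) AS cnt
    FROM TABLE2
    GROUP BY EMAIL_ADDRESS
    HAVING COUNT(*) > 1
    ORDER BY cnt DESC;

    If both sides carry heavy duplication on the join column, the join output explodes and the DISTINCT has to sort all of it, which would match the symptoms you describe.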

    “Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

    For fast, accurate and documented assistance in answering your questions, please read this article.
    Understanding and using APPLY, (I) and (II) Paul White
    Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

  • Vishal Sinha (10/27/2011)


    Thanks for the input. That is what I am thinking of doing - running the insert in batches.

    I'm pretty sure it won't help. Investigate the indexes instead; that SORT table comes from the DISTINCT in the SELECT query.

    The object that fills the tempdb could be the sort itself or a huge hash table.

    Another thing to investigate could be the degree of parallelism. How many logical CPUs are there? Have you set a MAXDOP for this instance? If not, you probably should.

    Does the query execute with OPTION (MAXDOP 1)?
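
    For reference, the hint goes at the end of the statement, and the current instance-wide setting can be read with sp_configure (a sketch; MAXDOP 1 here is just for the test, not a recommendation for the instance setting):

    -- Sketch: run the statement single-threaded to see whether parallelism is the problem
    INSERT INTO dbo.TABLE_NAME (MESSAGE_ID, ADDOPER, ADDDATE, MASTER_ID_COLUMN, SUB_ID_COLUMN)
    SELECT DISTINCT M.MESSAGE_ID, M.ADDOPER, M.ADDDATE, CC.MASTER_ID_COLUMN, CC.SUB_ID_COLUMN
    FROM TABLE1 AS CC
         INNER JOIN TABLE2 AS M ON CC.FORMATTED_PHONE_ADDRESS = M.EMAIL_ADDRESS
    OPTION (MAXDOP 1);

    -- Current instance-wide setting
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max degree of parallelism';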

    -- Gianluca Sartori

  • ChrisM@Work (10/27/2011)


    What does the following query return?

    SELECT COUNT(*)
    FROM TABLE1 AS CC
         INNER JOIN TABLE2 AS M ON CC.FORMATTED_PHONE_ADDRESS = M.EMAIL_ADDRESS

    Vishal...

    Answer this question first. This will likely identify the key to your problem.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

    Sorry about not being clear earlier. The statement was trying to insert 10 million records.
