Temp table not deleted when exec SQL string

  • When I try to run the script below, I get the following error:

    There is already an object named '#mytemp' in the database.

    Any idea why, and how to fix it?

    declare @sqlquery varchar(max)

    SET @sqlquery = '
    IF object_id(''tempdb..#mytemp'') IS NOT NULL
        DROP TABLE #mytemp;

    SELECT T517.* INTO #mytemp
    FROM T517 WITH (NOLOCK)

    TRUNCATE TABLE T517;
    INSERT INTO T517 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;

    IF object_id(''tempdb..#mytemp'') IS NOT NULL
        DROP TABLE #mytemp;

    SELECT T518.* INTO #mytemp
    FROM T518 WITH (NOLOCK)

    TRUNCATE TABLE T518;
    INSERT INTO T518 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;

    IF object_id(''tempdb..#mytemp'') IS NOT NULL
        DROP TABLE #mytemp;

    SELECT T519.* INTO #mytemp
    FROM T519 WITH (NOLOCK)

    TRUNCATE TABLE T519;
    INSERT INTO T519 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;
    '

    exec (@sqlquery);

  • I think I've seen this before. IIRC, this is a compile-time issue.

    Can you try removing the drops and using #mytemp1, #mytemp2, and #mytemp3 instead?
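    A minimal sketch of that suggestion, with one throw-away table name per set (table and column names copied from the original post; only the T517 and T518 sets are shown):

    ```sql
    DECLARE @sqlquery varchar(max);

    SET @sqlquery = '
    SELECT T517.* INTO #mytemp1 FROM T517 WITH (NOLOCK);
    TRUNCATE TABLE T517;
    INSERT INTO T517 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp1;

    -- Same pattern, but a fresh table name, so the batch compiles cleanly.
    SELECT T518.* INTO #mytemp2 FROM T518 WITH (NOLOCK);
    TRUNCATE TABLE T518;
    INSERT INTO T518 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp2;
    ';

    EXEC (@sqlquery);
    ```

    Because each SELECT ... INTO now targets a different table name, the batch never creates the same temp table twice, and no DROP statements are needed.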

  • When SQL Server parses/compiles the command, it runs into the fact that the table was already created by an earlier statement in the same batch. In a simple script, you can overcome that by using batch separators (usually "GO"), but you can't do that inside dynamic SQL.

    On the other hand, why add that level of complexity? In this script, you already know that the table exists with that name. If the name were dynamic, what you've written would make more sense. But it isn't. So just give each temp table a different name. Maybe "#MyTable1", "#MyTable2", or something like that.

    If you do that, you don't need the IF OBJECT_ID...DROP commands at all, which eliminates a whole level of complexity.

    For that matter, why is this dynamic SQL? Am I missing some place where data determines objects or something like that?

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
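    If the temp table name really must stay the same, another workaround (a sketch, not code from this thread) is to split the string into separate EXEC calls: each EXEC of a string compiles as its own batch, much like a "GO" separator, and a temp table created inside one EXEC is dropped automatically when that batch ends.

    ```sql
    DECLARE @step1 varchar(max), @step2 varchar(max);

    SET @step1 = '
    SELECT T517.* INTO #mytemp FROM T517 WITH (NOLOCK);
    TRUNCATE TABLE T517;
    INSERT INTO T517 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;';

    SET @step2 = '
    SELECT T518.* INTO #mytemp FROM T518 WITH (NOLOCK);
    TRUNCATE TABLE T518;
    INSERT INTO T518 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;';

    EXEC (@step1);  -- #mytemp is dropped when this batch finishes...
    EXEC (@step2);  -- ...so this batch is free to create its own #mytemp
    ```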

  • I am of the same opinion as Gus. Simplify this query. I am not seeing a reason from the code provided to have it be dynamic. If that is truly the case, why not change it?

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • The actual query is not this simple; I just created a simple one to show the issue.

    Is there any workaround for this, or should I create a new temp table for every set?

  • inayatkhan (11/29/2011)


    The actual query is not this simple; I just created a simple one to show the issue.

    Is there any workaround for this, or should I create a new temp table for every set?

    A new temp table should work.

  • inayatkhan (11/29/2011)


    The actual query is not this simple; I just created a simple one to show the issue.

    Is there any workaround for this, or should I create a new temp table for every set?

    Use a new temp table for each set. Or, if all the temp tables will have the same columns, you could re-use one and just truncate and re-load it each time.

    I prefer a different temp table for each, since that avoids some command accidentally leaving the last set's data in the table and causing false results.

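    A sketch of that truncate-and-reload option. The column types below are assumptions (the thread never shows T517's definition); the point is that CREATE TABLE appears only once, so the batch compiles even though the table is re-used:

    ```sql
    DECLARE @sqlquery varchar(max);

    SET @sqlquery = '
    CREATE TABLE #mytemp (UserID int, AttributeValue varchar(255), CreationTime datetime);  -- assumed types

    INSERT INTO #mytemp (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM T517 WITH (NOLOCK);
    TRUNCATE TABLE T517;
    INSERT INTO T517 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;

    TRUNCATE TABLE #mytemp;  -- clear the previous set, per the warning above
    INSERT INTO #mytemp (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM T518 WITH (NOLOCK);
    TRUNCATE TABLE T518;
    INSERT INTO T518 WITH (TABLOCK) (UserID, AttributeValue, CreationTime)
    SELECT UserID, AttributeValue, CreationTime FROM #mytemp;
    ';

    EXEC (@sqlquery);
    ```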
