Delete statement with NOT IN

  • I have two tables (table1 and table2) that are joined by OrdId. I also have table3 with OrdId.

    I need to delete, from table1 and table2 at the same time, all OrdId values that are not in table3.

    Can someone help me write a DELETE statement for that?

    Thank you very much

  • There are prettier ways to do this, but does this give you some ideas?

    DELETE FROM MyTable
    WHERE orderid IN (SELECT orderID FROM Table1 WHERE orderID IS NOT NULL)
      AND orderid IN (SELECT orderID FROM Table2 WHERE orderID IS NOT NULL)
      AND orderid NOT IN (SELECT orderID FROM Table3 WHERE orderID IS NOT NULL);

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question. Asking the question the right way gets you a tested answer the fastest way possible!

  • In a test environment, try this:

    delete from dbo.Table1
    from dbo.Table1 t1
        left outer join dbo.Table3 t3 on (t1.OrdID = t3.OrdID)
    where t3.OrdID is null;

    delete from dbo.Table2
    from dbo.Table2 t2
        left outer join dbo.Table3 t3 on (t2.OrdID = t3.OrdID)
    where t3.OrdID is null;
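
    For comparison, the same intent can also be written with NOT EXISTS (a minimal sketch, assuming OrdID is the only join key and the same dbo.Table1/dbo.Table2/dbo.Table3 names as above):

    -- sketch: delete rows whose OrdID has no match in Table3
    DELETE t1
    FROM dbo.Table1 AS t1
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Table3 AS t3 WHERE t3.OrdID = t1.OrdID);

    DELETE t2
    FROM dbo.Table2 AS t2
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Table3 AS t3 WHERE t3.OrdID = t2.OrdID);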

  • I have around 190 million records. What would be the quickest way to delete them?

  • Try to delete in batches. Example:

    DECLARE @rc int
    SET @rc = 1
    SET ROWCOUNT 10000   -- batch size based on what is fine for you

    WHILE @rc > 0
    BEGIN
        DELETE FROM Tablename
        WHERE <your criteria>

        SET @rc = @@ROWCOUNT
        WAITFOR DELAY '00:00:02.000'   -- to let other processes do their job
    END

    MJ
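
    On SQL Server 2005 and later, the same batching pattern can be written with DELETE TOP (n) instead of SET ROWCOUNT, which is deprecated for limiting DELETE/INSERT/UPDATE statements. A minimal sketch, assuming the NOT EXISTS criteria and table names from the earlier posts:

    DECLARE @rc int
    SET @rc = 1

    WHILE @rc > 0
    BEGIN
        -- delete one batch of rows that have no match in Table3
        DELETE TOP (10000)                 -- batch size: tune to your system
        FROM dbo.Table1
        WHERE NOT EXISTS (SELECT 1
                          FROM dbo.Table3 AS t3
                          WHERE t3.OrdID = dbo.Table1.OrdID)

        SET @rc = @@ROWCOUNT

        WAITFOR DELAY '00:00:02'           -- let other processes do their job
    END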

  • With that many rows you may also consider two more things:

    How many rows will be left after the delete? Sometimes it's just faster to figure out the good rows, insert those rows into a new table, rename the old table, and rename the new table to the old name. Drop the original table after you have backed it up.

    If you go with the batched delete, the log will grow as much as you delete data, which can cause a lot more problems. You may also do a log backup every X loops to keep the log from growing and locking the transactions for that time.
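
    A rough sketch of that copy-and-swap approach, assuming the dbo.Table1/dbo.Table3 names from earlier (indexes, constraints, and permissions on the new table still have to be recreated by hand):

    -- copy only the good rows (those that exist in Table3) into a new table
    SELECT t1.*
    INTO dbo.Table1_new
    FROM dbo.Table1 AS t1
    WHERE EXISTS (SELECT 1 FROM dbo.Table3 AS t3 WHERE t3.OrdID = t1.OrdID);

    -- swap the tables
    EXEC sp_rename 'dbo.Table1', 'Table1_old';
    EXEC sp_rename 'dbo.Table1_new', 'Table1';

    -- drop the original only after it has been backed up and the data verified
    -- DROP TABLE dbo.Table1_old;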

  • Ninja's_RGR'us (1/18/2010)


    With that many rows you may also consider two more things:

    How many rows will be left after the delete? Sometimes it's just faster to figure out the good rows, insert those rows into a new table, rename the old table, and rename the new table to the old name. Drop the original table after you have backed it up.

    If you go with the batched delete, the log will grow as much as you delete data, which can cause a lot more problems. You may also do a log backup every X loops to keep the log from growing and locking the transactions for that time.

    As mentioned above, how many records are going to be left? I agree, it may be better to copy out the good records, truncate the tables, and reload.

    If you must still delete records, please read this article.

  • I can't rename the table. I need to leave 200 million records in that table and delete 190 million records.

    I created a stored procedure:

    BEGIN
        SET ROWCOUNT 1000000;

        BEGIN TRANSACTION
            delete dbo.table1
            FROM dbo.ext_table1 INNER JOIN
                 dbo.table2 ON dbo.table1.OrderID = dbo.table2.OrderID;
        COMMIT
    END

    It takes 1 hour and 15 minutes to delete. Do you have any other suggestions on how I can do it faster? I checked the data log and it is not getting bigger. Thank you

  • Krasavita (1/19/2010)


    I can't rename the table. I need to leave 200 million records in that table and delete 190 million records.

    I created a stored procedure:

    BEGIN
        SET ROWCOUNT 1000000;

        BEGIN TRANSACTION
            delete dbo.table1
            FROM dbo.ext_table1 INNER JOIN
                 dbo.table2 ON dbo.table1.OrderID = dbo.table2.OrderID;
        COMMIT
    END

    It takes 1 hour and 15 minutes to delete. Do you have any other suggestions on how I can do it faster? I checked the data log and it is not getting bigger. Thank you

    Not much else you can do since you need to keep 200,000,000 rows in the table after deleting 190,000,000 rows. Best bet, do it in batches.

  • Ya, just make sure you keep backing up the log so you don't run out of space.

    You may also test with smaller batches to see which is faster.

    Another way that could be faster would be to do a NOLOCK select into a temp table containing only the keys of the rows that need to be deleted, then take the TOP 100,000 from that table and join it to the main table for the delete (see the sketch below).

    Back up the log every 5 runs, maybe.
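
    A sketch of that key-staging approach. The temp table, batch size, database name, and backup path are all placeholders, and NOLOCK reads can return dirty data, so verify the key list before deleting:

    -- stage the keys of the rows to delete (dirty read keeps locking light)
    SELECT t1.OrdID
    INTO #ToDelete
    FROM dbo.Table1 AS t1 WITH (NOLOCK)
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Table3 AS t3 WHERE t3.OrdID = t1.OrdID);

    CREATE CLUSTERED INDEX ix_ToDelete ON #ToDelete (OrdID);

    DECLARE @rc int, @loops int
    SET @rc = 1
    SET @loops = 0

    WHILE @rc > 0
    BEGIN
        -- delete one batch of rows by joining the base table to the staged keys
        DELETE TOP (100000) t1
        FROM dbo.Table1 AS t1
        INNER JOIN #ToDelete AS d ON d.OrdID = t1.OrdID

        SET @rc = @@ROWCOUNT
        SET @loops = @loops + 1

        -- back up the log every 5 batches so it does not keep growing
        -- (assumes FULL recovery model; database name and path are placeholders)
        IF @loops % 5 = 0
            BACKUP LOG MyDatabase TO DISK = N'X:\Backups\MyDatabase_log.trn'
    END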

  • Thank you
