January 15, 2010 at 10:39 am
I have 2 tables (table1 and table2) that are joined by OrdId. I also have table3, which has OrdId as well.
I need to delete, at the same time, all rows in table1 and table2 whose OrdId is not in table3.
Can someone help me write a delete statement for that?
Thank you very much
January 15, 2010 at 10:48 am
There are prettier ways to do this, but does this give you some ideas?
DELETE FROM MyTable
WHERE orderid IN (SELECT orderID FROM Table1 WHERE orderID IS NOT NULL)
AND orderid IN (SELECT orderID FROM Table2 WHERE orderID IS NOT NULL)
AND orderid NOT IN (SELECT orderID FROM Table3 WHERE orderID IS NOT NULL)
Lowell
January 15, 2010 at 10:52 am
In a test environment, try this:
delete from dbo.Table1
from
dbo.Table1 t1
left outer join dbo.Table3 t3
on (t1.OrdID = t3.OrdID)
where
t3.OrdID is null;
delete from dbo.Table2
from
dbo.Table2 t2
left outer join dbo.Table3 t3
on (t2.OrdID = t3.OrdID)
where
t3.OrdID is null;
January 18, 2010 at 6:47 am
I have around 190 million records. What would be the quickest way to delete them?
January 18, 2010 at 2:33 pm
With that many rows you may also consider 2 more things:
How many rows will be left after the delete? Sometimes it's just faster to figure out the good rows, insert those rows into a new table, rename the old table, and rename the new table to the old name. Drop the original table after you have backed it up.
If you go with the batched delete, the log will grow by as much data as you delete. This can cause a lot more problems. You may also do a log backup every X loops to keep the log from growing and locking the transactions for that time.
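A minimal sketch of the copy-and-rename approach described above, assuming the Table1/Table3 and OrdID names from the original question (your real schema may differ, and any indexes and constraints must be recreated on the new table):

```sql
-- Sketch only: assumes dbo.Table1(OrdID, ...) and dbo.Table3(OrdID).
-- 1. Copy the rows you want to keep into a new table.
SELECT t1.*
INTO dbo.Table1_new
FROM dbo.Table1 t1
WHERE EXISTS (SELECT 1 FROM dbo.Table3 t3 WHERE t3.OrdID = t1.OrdID);

-- 2. Swap the tables (after backing up the original).
EXEC sp_rename 'dbo.Table1', 'Table1_old';
EXEC sp_rename 'dbo.Table1_new', 'Table1';

-- 3. Drop the original once you have verified the new table.
-- DROP TABLE dbo.Table1_old;
```

Note that SELECT ... INTO copies the data but not indexes, constraints, or triggers; those have to be scripted back onto the new table before the swap.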
January 18, 2010 at 2:46 pm
Ninja's_RGR'us (1/18/2010)
With that many rows you may also consider 2 more things: How many rows will be left after the delete? Sometimes it's just faster to figure out the good rows, insert those rows into a new table, rename the old table, and rename the new table to the old name. Drop the original table after you have backed it up.
If you go with the batched delete, the log will grow by as much data as you delete. This can cause a lot more problems. You may also do a log backup every X loops to keep the log from growing and locking the transactions for that time.
As mentioned above, how many records are going to be left? I agree, it may be better to copy out the good records, truncate the tables, and reload.
If you must still delete records, please read this article.
January 19, 2010 at 5:57 am
I can't rename the table. I need to leave 200 million records in that table and delete 190 million records.
I created a stored procedure:
BEGIN
SET ROWCOUNT 1000000;
BEGIN TRANSACTION
DELETE dbo.table1
FROM dbo.ext_table1 INNER JOIN
dbo.table2 ON dbo.table1.OrderID = dbo.table2.OrderID;
COMMIT
END
It takes 1 hour and 15 minutes to delete. Do you have any other suggestions for how I can do it faster? I checked the data log; it is not getting bigger. Thank you
January 19, 2010 at 7:17 am
Krasavita (1/19/2010)
I can't rename the table. I need to leave 200 million records in that table and delete 190 million records. I created a stored procedure:
BEGIN
SET ROWCOUNT 1000000;
BEGIN TRANSACTION
delete dbo.table1
FROM dbo.ext_table1 INNER JOIN
dbo.table2 ON dbo.table1.OrderID = dbo.table2.OrderID;
COMMIT
END
It takes 1 hour and 15 minutes to delete. Do you have any other suggestions for how I can do it faster? I checked the data log; it is not getting bigger. Thank you
Not much else you can do since you need to keep 200,000,000 rows in the table after deleting 190,000,000 rows. Best bet: do it in batches.
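As a hedged sketch of the batched delete suggested above (table and column names assumed from the original question): a DELETE TOP (N) loop is generally preferred over SET ROWCOUNT, which is deprecated for data-modification statements in newer SQL Server versions.

```sql
-- Sketch only: assumes dbo.Table1(OrdID) and dbo.Table3(OrdID).
-- Deletes rows from Table1 that have no match in Table3, 100,000 at a time,
-- so each transaction (and its log usage) stays small.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (100000) t1
    FROM dbo.Table1 t1
    LEFT JOIN dbo.Table3 t3 ON t3.OrdID = t1.OrdID
    WHERE t3.OrdID IS NULL;

    SET @rows = @@ROWCOUNT;  -- loop ends when a pass deletes nothing
END
```

Batch size is worth tuning; smaller batches commit faster and block less, while larger ones make fewer passes over the table.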
January 19, 2010 at 7:45 am
Yeah, just make sure you keep backing up the log so you don't run out of space.
You may also test with smaller batches to see which is faster.
Another way that could be faster would be to do a NOLOCK select and insert into a temp table that contains only the keys of the rows that need to be deleted. Then get the TOP 100,000 from that table and join to the main table for the delete.
Back up the log every 5 runs, maybe.
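The keys-first approach described above might look roughly like this. This is a sketch under assumptions: the OrdID names come from the thread, the database name and backup path are placeholders, and the periodic BACKUP LOG only applies under the FULL or BULK_LOGGED recovery model.

```sql
-- Sketch only: gather the keys to delete once, then delete in keyed batches.
SELECT OrdID
INTO #ToDelete
FROM dbo.Table1 WITH (NOLOCK)   -- dirty read, as suggested, to avoid blocking
WHERE OrdID NOT IN (SELECT OrdID FROM dbo.Table3 WHERE OrdID IS NOT NULL);

CREATE CLUSTERED INDEX IX_ToDelete ON #ToDelete (OrdID);

DECLARE @batch int = 1, @loops int = 0;
WHILE @batch > 0
BEGIN
    DELETE TOP (100000) t1
    FROM dbo.Table1 t1
    INNER JOIN #ToDelete d ON d.OrdID = t1.OrdID;

    SET @batch = @@ROWCOUNT;
    SET @loops = @loops + 1;

    IF @loops % 5 = 0   -- every 5 runs, as suggested; name and path are placeholders
        BACKUP LOG MyDatabase TO DISK = 'X:\backups\MyDatabase.trn';
END

DROP TABLE #ToDelete;
```

The one-time key scan avoids re-evaluating the NOT IN predicate against Table3 on every batch, which is where much of the time goes in a straight batched delete.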
January 19, 2010 at 10:32 am
Thank you