March 26, 2008 at 7:39 am
I am trying to delete records from a table that has 60 million records, and it is taking forever.
Can someone help me make this process faster?
DECLARE @START_DT DATETIME
--SET @START_DT TO THE FIRST OF THE MONTH 6 MONTHS AGO
SET @START_DT = GETDATE() - 180
SET @START_DT = CAST(MONTH(@START_DT) AS VARCHAR(2)) + '/' + '01/' + CAST(YEAR(@START_DT) AS VARCHAR(4))
DECLARE @BatchSize INT
SET @BatchSize = 100000 -- Modify as needed !!!
SET NOCOUNT ON
DECLARE @RowsDeleted BIGINT
DECLARE @MyRowcount BIGINT
SET @RowsDeleted = 0
SET @MyRowcount = 0
WHILE 0 = 0
BEGIN
    -- Delete the next batch of table1 rows that have a matching
    -- table2 row older than @START_DT
    DELETE TOP (@BatchSize) table1
    FROM table2
    INNER JOIN table1
        ON table2.GROUP_ZZ = table1.GROUP_ZZ
        AND table2.PATIENT = table1.PATIENT
        AND table2.INVOICE = table1.INVOICE
    WHERE table2.SER_DT < @START_DT

    SET @MyRowcount = @@ROWCOUNT
    IF @MyRowcount = 0 BREAK
    SET @RowsDeleted = @RowsDeleted + @MyRowcount
END
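As a side note, GETDATE() - 180 is only approximately six months, and the string round-trip above depends on the server's date-format settings. A minimal alternative sketch using pure date arithmetic to get the first of the month exactly six calendar months back:

DECLARE @START_DT DATETIME
-- months since 1900-01-01, minus 6, added back onto date 0 =
-- the first day of the month six months ago
SET @START_DT = DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 6, 0)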
March 26, 2008 at 8:04 am
I've often found that modifying that kind of script to "walk the clustered key" works better.
Also - an EXISTS-based syntax might actually work better. It also makes sure that a given row from table1 isn't being matched "multiple" times due to one-to-many joins...
Meaning - structure it so that the chunks being deleted are as contiguous as possible.
For example - if you had an identity field called ID on that table:
-- assumes @START_DT is declared and set as in the original post
declare @startID int
set @startID = 0;
declare @maxID int
select @maxID = max(id) from table1;
declare @batchsize int
set @batchsize = 10000; --modify as needed
WHILE @startID <= @maxID -- <= so the final ID range isn't skipped
BEGIN
    Delete from table1
    Where
        id between @startID and @startID + @batchsize
        and EXISTS (
            select NULL
            from table2
            where
                table2.GROUP_ZZ = table1.GROUP_ZZ
                AND table2.PATIENT = table1.PATIENT
                AND table2.INVOICE = table1.INVOICE
                AND table2.SER_DT < @START_DT
        );
    set @startID = @startID + @batchsize + 1;
END
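If the table has no identity column, the same walk-the-key idea can be applied to the leading column of the clustered index instead. A rough sketch, assuming GROUP_ZZ is an int and leads the clustered key (adjust both to the real table) - note the "batch" here is one key value per iteration, which only helps if no single GROUP_ZZ covers too many rows:

DECLARE @START_DT DATETIME
-- first of the month, six calendar months back
SET @START_DT = DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 6, 0)
DECLARE @curKey INT
SELECT @curKey = MIN(GROUP_ZZ) FROM table1
WHILE @curKey IS NOT NULL
BEGIN
    -- delete every qualifying row for this key value...
    DELETE FROM table1
    WHERE GROUP_ZZ = @curKey
      AND EXISTS (
            SELECT NULL
            FROM table2
            WHERE table2.GROUP_ZZ = table1.GROUP_ZZ
              AND table2.PATIENT = table1.PATIENT
              AND table2.INVOICE = table1.INVOICE
              AND table2.SER_DT < @START_DT
          )
    -- ...then step to the next key value (MIN over an empty set
    -- returns NULL, which ends the loop)
    SELECT @curKey = MIN(GROUP_ZZ) FROM table1 WHERE GROUP_ZZ > @curKey
END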
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right - what was my emergency again?
March 26, 2008 at 8:10 am
Try this on for size:
/*********************************************************************/
SET NOCOUNT ON
DECLARE @START_DT Datetime
DECLARE @BatchSize INT
DECLARE @RowsDeleted BIGINT
DECLARE @MyRowcount BIGINT
--SET @START_DT TO THE FIRST OF THE MONTH 6 MONTHS AGO
SET @START_DT = GETDATE() - 180
SET @START_DT = CAST(MONTH(@START_DT) AS VARCHAR(2)) +'/' + '01/' + CAST(YEAR(@START_DT) AS VARCHAR(4))
SET @BatchSize = 100000 -- Modify as needed !!!
SET @RowsDeleted = 0
SET @MyRowcount = -1
CREATE TABLE #t2 (
GROUP_ZZ INT, -- use the same data types as the columns in table2;
PATIENT INT, -- INT is only a placeholder here
INVOICE INT
)
INSERT #t2 (GROUP_ZZ, PATIENT, INVOICE)
SELECT DISTINCT GROUP_ZZ, PATIENT, INVOICE
FROM table2
WHERE SER_DT < @START_DT
ORDER BY GROUP_ZZ, PATIENT, INVOICE
WHILE @MyRowCount <> 0
BEGIN
DELETE TOP (@BatchSize) table1
FROM #t2
INNER JOIN table1 ON #t2.GROUP_ZZ = table1.GROUP_ZZ
AND #t2.PATIENT = table1.PATIENT
AND #t2.INVOICE = table1.INVOICE
SET @MyRowcount = @@ROWCOUNT
SET @RowsDeleted = @RowsDeleted + @MyRowcount
END
DROP TABLE #t2
SET NOCOUNT OFF
/*********************************************************************/
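One optional tweak to consider: if #t2 ends up large, a clustered index on its join keys should let each batched delete seek into the temp table rather than scan it (the index name below is just illustrative):

-- Run right after the INSERT into #t2
CREATE CLUSTERED INDEX IX_t2_keys ON #t2 (GROUP_ZZ, PATIENT, INVOICE)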
Hope this helps!
Cogiko ergo sum (I geek, therefore I am).
March 26, 2008 at 9:38 am
Also take a look at your referential integrity. If the table you are deleting from is on the PK side of a foreign key relationship (i.e., a record must exist in this table for a key in another table), the data in the other table must be checked for each deleted record to make sure you have not just violated the constraint.
Sometimes you need to carefully remove or disable constraints before big delete jobs.
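A minimal sketch of that pattern, using hypothetical table and constraint names (ChildTable, FK_Child_Parent):

-- Disable the FK check while the big delete runs (hypothetical names)
ALTER TABLE ChildTable NOCHECK CONSTRAINT FK_Child_Parent

-- ... run the batched delete here ...

-- Re-enable it; WITH CHECK revalidates existing rows so the
-- constraint is trusted by the optimizer again
ALTER TABLE ChildTable WITH CHECK CHECK CONSTRAINT FK_Child_Parent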
March 26, 2008 at 11:00 am
Thanks guys,
Matt's idea is good, but there is no identity column in the table.
I am trying David's code: copying the data that needs to be deleted into the temp table took 14 minutes, and the delete joining that temp table to the real table is still running.
I'll let you know how long the delete takes with the temp table join.