November 17, 2005 at 4:28 am
Hi all,
We have a requirement from our client to delete all the information related to one of their existing sub-clients.
There are about 59-60 tables that contain the related information,
so we have to delete the matching records from all 59 tables.
The approach we have decided on goes like this: initially we fetch the respective customer IDs from the primary table (a total of about 400,000 records) into a temporary parent table that we created.
Secondly, we fetch only 100 records from the parent table and insert them into a temporary child table.
Then we have constructed a loop in such a way that it takes those 100 records from the child table and starts deleting from the respective 59 tables.
This process takes around 2-3 minutes per loop
(since the table count is high, i.e. 59, and each table holds more than 60,000 records). Considering the overall 400,000 records, it would run for days, which is not acceptable to our client.
Is there any other approach we can take in this kind of scenario to make the purge process faster and more efficient?
I have a lot of confidence in this group, and I have experienced the same in the past.
It would be great if I could get any help regarding this.
My logic looks like this:
WHILE (SELECT COUNT(*) FROM ParentTable) > 0
BEGIN
    SET ROWCOUNT 100
    INSERT INTO ChildTable (ClientID)
        SELECT ClientID FROM ParentTable
    SET ROWCOUNT 0

    DELETE TB
    FROM Table1 TB
    INNER JOIN ChildTable MP ON MP.ClientID = TB.ClientID

    -- ... the same DELETE is repeated for Table2 through Table59 ...

    -- remove the processed IDs from the parent table and empty the child table
    DELETE P
    FROM ParentTable P
    INNER JOIN ChildTable MP ON MP.ClientID = P.ClientID

    TRUNCATE TABLE ChildTable
END
Kindly let me know if you need any more information.
Thanks And regards,
Prabu.P
November 17, 2005 at 5:35 am
Given that you have so many tables, I suspect that the relationships will be pretty complex. Without knowing the relationships between all the tables, it is difficult to get specific.
However, have you thought about turning on cascading deletes? This will allow SQL Server to do all of the deletion work for you (once you have the relationships set up correctly, which will take a while).
E.g., you execute:
delete from customer where customerid in (select id from IDsToDelete)
and the cascading deletes get rid of all the associated records.
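As a sketch of how the cascading setup might look (the table and constraint names here are illustrative, not from the original poster's schema; each of the 59 child tables would need a similar constraint):

```sql
-- Hypothetical parent/child pair; names are assumptions for illustration.
ALTER TABLE Orders
    ADD CONSTRAINT FK_Orders_Customer
    FOREIGN KEY (ClientID) REFERENCES Customer (ClientID)
    ON DELETE CASCADE;

-- With the constraints in place, a single delete against the parent
-- removes the matching rows from every child table automatically:
DELETE FROM Customer
WHERE ClientID IN (SELECT id FROM IDsToDelete);
```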
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
November 18, 2005 at 8:04 am
I can tell you two things:
1. You can probably do more than 100 records at a time. When I did this, I found that batches of 10,000-14,000 records deleted relatively quickly and reduced the overhead of processing only 100 at a time. Note: this was done on a live database, with each batch of deletes taking less than 1 minute.
2. Either make sure your transaction log and data files are big enough to handle the deletes/inserts (and are being backed up regularly), or change the database to the simple recovery model so the deletes are minimally logged; but be sure to take a downtime and back up the database before and after.
Note: cascading deletes will trigger a delete against each child table, so they can be slow. You might want to try dropping referential integrity to speed up the deletes, especially if there are a lot of foreign keys involved.
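A minimal sketch of the larger-batch approach, using the same SET ROWCOUNT style as the original loop (table and column names are assumptions taken from the poster's pseudocode):

```sql
-- Sketch: purge one table in batches of ~10,000 instead of 100.
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE TB
    FROM Table1 TB
    INNER JOIN ChildTable MP ON MP.ClientID = TB.ClientID

    -- stop when the last batch deleted nothing
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0
```

The batch size is a tuning knob: large enough to cut per-loop overhead, small enough that each transaction stays short and the log does not balloon.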
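The recovery-model switch described above might look like this (database name and backup paths are placeholders; a purge of this size still writes to the log under simple recovery, it just allows the log to truncate at each checkpoint):

```sql
-- Back up before the purge window.
BACKUP DATABASE MyDatabase TO DISK = 'D:\Backups\MyDatabase_prepurge.bak'

-- Switch to simple recovery for the duration of the purge.
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE

-- ... run the batched deletes here ...

-- Switch back and take a fresh full backup to restart the log chain.
ALTER DATABASE MyDatabase SET RECOVERY FULL
BACKUP DATABASE MyDatabase TO DISK = 'D:\Backups\MyDatabase_postpurge.bak'
```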
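Rather than dropping the foreign keys outright, one option is to disable the constraint checks for the purge and re-enable them with validation afterwards (constraint and table names here are illustrative):

```sql
-- Disable the FK check so deletes don't pay the referential-integrity cost.
ALTER TABLE Orders NOCHECK CONSTRAINT FK_Orders_Customer

-- ... run the deletes ...

-- Re-enable WITH CHECK so SQL Server validates existing rows and
-- marks the constraint as trusted again.
ALTER TABLE Orders WITH CHECK CHECK CONSTRAINT FK_Orders_Customer
```

Re-enabling with validation matters: a constraint re-enabled without WITH CHECK is left untrusted and the optimizer cannot rely on it.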