November 19, 2013 at 10:58 am
Hi,
In my stored procedure I need to use a DELETE statement, and I cannot use TRUNCATE TABLE.
The source table contains several tens of millions of rows, and I need to remove some tens of millions of them.
The recovery model of the db is SIMPLE.
The DELETE statement is similar to this:
delete from work.dbo.WRK_FACT
where month >= @I and year = @year
Any ideas to improve the DELETE performance?
Thanks
November 19, 2013 at 11:07 am
Use a DELETE in batches.
-- Loop counter renamed so it doesn't collide with the @I parameter (T-SQL variable names are not case sensitive)
DECLARE @rc int = 1
WHILE @rc > 0
BEGIN
    DELETE TOP (1000000) work.dbo.WRK_FACT
    WHERE month >= @I AND year = @year
    SET @rc = @@ROWCOUNT
END
Or, if you have a trigger on the table that might alter the @@ROWCOUNT value, use an EXISTS check instead:
WHILE EXISTS (
    SELECT TOP 1 1
    FROM work.dbo.WRK_FACT
    WHERE month >= @I AND year = @year
)
BEGIN
    DELETE TOP (1000000) work.dbo.WRK_FACT
    WHERE month >= @I AND year = @year
END
November 19, 2013 at 11:16 am
This was discussed just recently.
http://www.sqlservercentral.com/Forums/Topic1511574-3412-1.aspx#bm1511578
If you are deleting more rows than you are keeping, you can populate a duplicate empty table with the rows you want to keep, drop the old table, and rename the new table.
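Roughly something like this (a sketch only; WRK_FACT_new is an illustrative name and the @I/@year filter is taken from the original post):

-- Copy the rows you want to KEEP into a new table (SELECT ... INTO is minimally logged under SIMPLE recovery)
SELECT *
INTO work.dbo.WRK_FACT_new
FROM work.dbo.WRK_FACT
WHERE NOT (month >= @I AND year = @year)

-- Recreate any indexes/constraints on WRK_FACT_new here, then swap the tables
DROP TABLE work.dbo.WRK_FACT
EXEC work.sys.sp_rename 'dbo.WRK_FACT_new', 'WRK_FACT'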
I know you said you can't use TRUNCATE, but you may wish to reconsider your reasons for not using it, since it is the absolute fastest option available. You could populate a temp table with the rows you want to keep, truncate the table, then move the saved rows from the temp table back to the original table.
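A sketch of that TRUNCATE variant, again reusing the filter from the original post (TRUNCATE TABLE requires ALTER permission on the table):

-- Save the rows you want to keep in tempdb
SELECT *
INTO #keep
FROM work.dbo.WRK_FACT
WHERE NOT (month >= @I AND year = @year)

-- Empty the table, then reload the saved rows
TRUNCATE TABLE work.dbo.WRK_FACT

INSERT INTO work.dbo.WRK_FACT WITH (TABLOCK)
SELECT * FROM #keep

DROP TABLE #keep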
__________________________________________________________________________________________________________
How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
November 19, 2013 at 2:39 pm
Hi LinksUp,
the batch approach for deleting rows is very interesting. I want to avoid managing a cursor for such an operation.
In my scenario, I need to populate a working table that serves as the data source for an SSAS solution, and I want to avoid partitioning it.
In order to SELECT ... INTO a new table, DROP the old table, and rename the new one, ALTER permissions are necessary.
My working table could contain 100 million records: 70 million rows might be deleted and 30 million kept. In that case, is filling a temp table a good solution? However, I could also need to delete 40 million records and keep 60 million.
Ultimately, the batch approach seems the better solution ... doesn't it?
Thanks
November 19, 2013 at 2:42 pm
pmscorca (11/19/2013)
Ultimately, the batch approach seems the better solution ... doesn't it?
Test, test, and test!!
There are a number of techniques to populate a test table with millions of rows in a very short time. Then you can implement the different scenarios to see which is the fastest, uses the fewest resources, has the best execution plan, etc.
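For example, a quick sketch along these lines (table and column names are made up for illustration) will give you ten million rows of dummy data very quickly:

-- Build a ~10 million row test table from a cross join of a system view
SELECT TOP (10000000)
    ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS id,
    ABS(CHECKSUM(NEWID())) % 12 + 1 AS month,
    ABS(CHECKSUM(NEWID())) % 10 + 2005 AS year,
    CAST(RAND(CHECKSUM(NEWID())) * 100 AS money) AS amount
INTO dbo.WRK_FACT_test
FROM sys.all_columns AS a
CROSS JOIN sys.all_columns AS b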
__________________________________________________________________________________________________________
How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
November 19, 2013 at 2:56 pm
LinksUp (11/19/2013)
pmscorca (11/19/2013)
Ultimately, the batch approach seems the better solution ... doesn't it?
Test, test, and test!!
There are a number of techniques to populate a test table with millions of rows in a very short time. Then you can implement the different scenarios to see which is the fastest, uses the fewest resources, has the best execution plan, etc.
+1000
When dealing with performance improvements, the only sure way to choose is by testing, and testing again after any change, on SQL Server.
November 19, 2013 at 2:56 pm
Ok.
However, I can point to this article: http://social.technet.microsoft.com/wiki/contents/articles/20651.delete-huge-amount-of-data-from-a-table.aspx