October 17, 2023 at 12:24 am
Total size of tables = 500 GB
Storage of the DB shows 2.6 TB. Microsoft recommends running DBCC SHRINKFILE with low priority, but it is not doing anything. Any ideas?
This is a prod DB, running 24x7, on Azure MI Business Critical.
October 17, 2023 at 8:14 am
what exact command are you using for it?
October 17, 2023 at 11:01 am
Also what is the "log_reuse_wait_desc" from sys.databases for the database in question?
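For reference, something like this should return it (the database name is a placeholder):
-- Check why log space cannot be reused for the database in question
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'YourDatabaseName'; -- placeholder, replace with the actual DB name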
October 17, 2023 at 11:31 am
Correct, the original question is a bit open to interpretation as to what is going on.
500 GB in tables but the DB is 2.6 TB; is that a 600 GB MDF and a 2 TB LDF? More information is needed, ideally.
October 17, 2023 at 1:53 pm
Data file = 3 TB
Log file = 200 MB
But the size of all tables = 500 GB, so what is taking 3 TB of space? I need to free up the storage to save $$$ (storage on Azure MI).
DBCC SHRINKFILE WITH LOW PRIORITY just keeps thinking for 2-3 hours and does not do anything!
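For reference, the file sizes and the database-level used vs. unallocated picture can be checked with something like this (a rough sketch):
-- File sizes in GB for the current database
SELECT name, type_desc, CAST(size AS bigint) * 8 / 1024 / 1024. AS size_gb
FROM sys.database_files;
-- Database-level reserved vs. unallocated space
EXEC sp_spaceused;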
October 17, 2023 at 3:22 pm
It can take days to shrink depending on how busy the DB is.
And I would expand that command - prefer to be specific about which files get shrunk and to what size.
The following command should give you the name of the file to shrink - you may have more than one, and if so, do the shrink for each file that needs it.
SELECT file_id,
       name,
       CAST(FILEPROPERTY(name, 'SpaceUsed') AS bigint) * 8 / 1024. AS space_used_mb,
       CAST(size AS bigint) * 8 / 1024. AS space_allocated_mb,
       CAST(max_size AS bigint) * 8 / 1024. AS max_file_size_mb
FROM sys.database_files
WHERE type_desc IN (
    'ROWS'
    --,'LOG' -- excluding it as your log size is rather small
);
The output of the above will give you a name and a space_used_mb.
Use these in the following:
-- shrink_to_value below should be at least space_used_mb * 1.2 (i.e. add 20% to the used space)
DBCC SHRINKFILE ('enter_name_from_above', shrink_to_value) WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);
But it will take a long time to finish.
October 18, 2023 at 8:35 am
It may be quicker to create a new database with the correct structure and size, copy the data from the large DB to the correctly sized DB, and then rename the two databases. You would need a maintenance window to do a delta load and switch the names around, but generally, when talking about large shrinks, a new DB and a copy can actually be a lot quicker to perform.
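A very rough sketch of the mechanics (database, file and table names are placeholders, every table needs its own copy step, and indexes/constraints/permissions are left out):
-- 1. Create a right-sized database (name and size are placeholders)
CREATE DATABASE MyDb_New;
-- Logical data file name below is an assumption - check sys.database_files in the new DB
ALTER DATABASE MyDb_New MODIFY FILE (NAME = 'MyDb_New', SIZE = 600GB);
-- 2. Copy each table across (repeat per table, or batch it for the big ones)
SELECT * INTO MyDb_New.dbo.MyTable FROM MyDb.dbo.MyTable;
-- 3. In the maintenance window, do the delta load, then swap the names
ALTER DATABASE MyDb MODIFY NAME = MyDb_Old;
ALTER DATABASE MyDb_New MODIFY NAME = MyDb;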
October 18, 2023 at 5:22 pm
Thank you! I think this is the only solution, and the best one!
Can it be done in batches, say every weekend for a month?
October 18, 2023 at 5:42 pm
Creating a new DB won't work, as there are hundreds of tables and millions of rows, and it's a live DB being used.
I would say it's a Microsoft defect, expecting us to run DBCC SHRINKFILE to release space to the OS, and on top of that it just sits in a suspended state.
And this is on Azure MI Business Critical, the most expensive and latest/greatest version of SQL Server!
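The suspended state can be watched with something along these lines (a sketch; the Dbcc filter is just illustrative):
-- What the shrink request is doing / waiting on, and how far along it is
SELECT session_id, command, status, wait_type, wait_time, percent_complete
FROM sys.dm_exec_requests
WHERE command LIKE 'Dbcc%';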
October 18, 2023 at 6:14 pm
You can try shrinking in 100 or 200 GB steps at a time - not advisable, but it would work.
The TRUNCATEONLY is here in case you get lucky and there is an empty block at the end of the file - if that happens, it will "remove" it faster than a normal shrink. The shrinks following it may then fail if the size specified is greater than the current size, but they will be faster in that case.
DBCC SHRINKFILE ('enter_name_from_above', TRUNCATEONLY);
DBCC SHRINKFILE ('enter_name_from_above', 2800000) WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);
DBCC SHRINKFILE ('enter_name_from_above', TRUNCATEONLY);
DBCC SHRINKFILE ('enter_name_from_above', 2600000) WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);
DBCC SHRINKFILE ('enter_name_from_above', TRUNCATEONLY);
DBCC SHRINKFILE ('enter_name_from_above', 2400000) WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);
DBCC SHRINKFILE ('enter_name_from_above', TRUNCATEONLY);
DBCC SHRINKFILE ('enter_name_from_above', 2200000) WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);
DBCC SHRINKFILE ('enter_name_from_above', TRUNCATEONLY);
and so on until size is acceptable.
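If you would rather not paste out every step by hand, roughly the same thing can be done in a loop - a sketch only; the stop point (650000 MB) and the step (200000 MB) are just illustrative values based on the space_used_mb guidance above:
DECLARE @file sysname = 'enter_name_from_above';
DECLARE @used bigint, @target bigint, @sql nvarchar(max);
DECLARE @stop bigint = 650000;   -- roughly space_used_mb * 1.2, in MB - adjust to your numbers
DECLARE @step bigint = 200000;   -- shrink ~200 GB per pass

-- Current space used and the first target (current size minus one step)
SELECT @used   = CAST(FILEPROPERTY(name, 'SpaceUsed') AS bigint) * 8 / 1024,
       @target = CAST(size AS bigint) * 8 / 1024 - @step
FROM sys.database_files
WHERE name = @file;

WHILE @target > @stop AND @target > @used
BEGIN
    SET @sql = N'DBCC SHRINKFILE (' + QUOTENAME(@file) + N', '
             + CAST(@target AS nvarchar(20))
             + N') WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);';
    EXEC sys.sp_executesql @sql;
    SET @target = @target - @step;
END;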
October 18, 2023 at 7:51 pm
Be aware that, depending on which MI SKU you have, shrinking the database may reduce the disk performance available to the instance. I suspect the database is the size that it is for a reason.
October 18, 2023 at 9:45 pm
You can just shrink it a bit at a time. So instead of trying to shrink it from 2.6 TB to 500 GB in one go, just try making it 50 GB smaller than its current size. If that works in a reasonable amount of time, then try shrinking by a bit more.
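For example, to see the current allocated size and a target that is 50 GB lower (a sketch):
-- Current allocated size per data file, and a target 50 GB (51200 MB) lower
SELECT name,
       CAST(size AS bigint) * 8 / 1024 AS current_size_mb,
       CAST(size AS bigint) * 8 / 1024 - 51200 AS target_for_shrink_mb
FROM sys.database_files
WHERE type_desc = 'ROWS';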
November 9, 2023 at 8:35 pm
Just wanted to add that this isn't a "Microsoft defect", as it is the job of the DBA to make sure things are properly sized. I wouldn't blame Dell if the laptop I bought had a 1 TB SSD and I filled it up. That's my fault for not buying enough disk OR for downloading too much junk. If the DBA setting this up thought that 2.6 TB was how much disk you needed and you only needed 500 GB, that is a mistake of the DBA, not a mistake on Microsoft's part.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are any negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
November 9, 2023 at 9:32 pm
It's managed SQL Server Business Critical, with the latest version and MS-automated management of the log file.
The issue here is that purge and archival of data creates lots of free space within the files, but it needs a manual cleanup by running DBCC SHRINKFILE, which is irksome!
If they automated Always On etc. in Business Critical, this should be a piece of cake!
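In the meantime it can at least be wrapped up ourselves and scheduled after the purge/archive job, e.g. from a SQL Agent job (a rough sketch only - the procedure name, the single-data-file assumption and the 50 GB step are all placeholders):
-- One modest shrink pass per run, so it can be scheduled after the purge job
CREATE OR ALTER PROCEDURE dbo.ShrinkDataFileOnePass
AS
BEGIN
    DECLARE @file sysname, @target bigint, @sql nvarchar(max);

    -- Assumes a single data file; target is current size minus ~50 GB
    SELECT TOP (1) @file = name,
           @target = CAST(size AS bigint) * 8 / 1024 - 51200
    FROM sys.database_files
    WHERE type_desc = 'ROWS';

    -- Only shrink if the target still leaves ~20% headroom over the used space
    IF @target > CAST(FILEPROPERTY(@file, 'SpaceUsed') AS bigint) * 8 / 1024 * 1.2
    BEGIN
        SET @sql = N'DBCC SHRINKFILE (' + QUOTENAME(@file) + N', '
                 + CAST(@target AS nvarchar(20))
                 + N') WITH WAIT_AT_LOW_PRIORITY (ABORT_AFTER_WAIT = SELF);';
        EXEC sys.sp_executesql @sql;
    END
END;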