September 18, 2017 at 2:51 am
We have the SSISDB database, which stores log information for the jobs that are run. It was initially set to retain 365 days' worth of data, and this quickly grew out of hand (over 25 GB in size), and it was being backed up daily and kept for 45 days. This filled up our backup drive.
I have decreased the retention period to 45 days, and the maximum number of versions per project is set to 10. As a quick measure to free some space on the drive so other backups could run, we deleted many old .bak files for the SSISDB database that were no longer needed.
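(For anyone following the same steps: I believe these settings can also be changed in T-SQL through the catalog properties, something like the below - the property names are from memory, so double-check them.)
-- Assumed SSISDB catalog property names for the retention window and project version limit
EXEC SSISDB.catalog.configure_catalog @property_name = N'RETENTION_WINDOW', @property_value = 45;
EXEC SSISDB.catalog.configure_catalog @property_name = N'MAX_PROJECT_VERSIONS', @property_value = 10;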
The maintenance job created automatically by SSIS ran last night to delete information older than 45 days (it took nearly 2 hours to complete), but the database is still huge. I ran a maintenance package to shrink the database, then a transaction log backup (which usually helps to reduce the size), but this did not seem to help this time.
Currently the database is 22656 MB, and there is 16197 MB of free space. The maintenance plan is set to shrink the database if it grows beyond 500 MB, retain 5% free space after the shrink, and return freed space to the operating system.
What else can I do to get rid of this (nearly) 16 GB of free space being hogged by the database?
Thanks
September 18, 2017 at 4:36 am
Is the free space in the data or the log file?
If it's the log, what is the status of log_reuse_wait_desc in sys.databases?
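Something like this will show it:
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'SSISDB';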
If it's the data file, shrink it manually in a series of chunks until you get to your target size.
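For example, something along these lines, run repeatedly with smaller and smaller targets - the logical file name here is just a placeholder, use whatever sys.database_files shows for your data file:
USE SSISDB
GO
-- targets are in MB; step down gradually rather than shrinking in one big jump
DBCC SHRINKFILE (N'data', 20000);
GO
DBCC SHRINKFILE (N'data', 18000);
GO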
I have never trusted the maintenance plan shrink tasks to do what you tell them to do.
September 19, 2017 at 4:58 am
anthony.green - Monday, September 18, 2017 4:36 AM
Hi
The status of log_reuse_wait_desc is "2".
If I run:
USE SSISDB
GO
sp_spaceused
GO
I get:
database_name = SSISDB
database_size = 21743.50 MB
unallocated space = 16189.50 MB
This morning I have:
1) run the maintenance job generated automatically by SSMS (called SSIS Server Maintenance Job)
2) run my own maintenance package, which is set to check the database integrity and shrink the database if it grows by more than 500 MB, keep 10% free space, and return unused space to the operating system
3) run a transaction log backup
4) run a full backup
However, the 16 GB of free space still remains in the database.
September 19, 2017 at 2:55 pm
r.gall - Tuesday, September 19, 2017 4:58 AM
One of the things you'll really want to do is get out of the practice of shrinking databases - it causes more harm than good. You can find some of the issues, and a list of related articles on why you don't want to be shrinking databases, in the following:
Stop Shrinking Your Database Files. Seriously. Now.
It's something you really do not want to do regularly. At times you may have to, such as when a large process has just blown up the database, it doesn't need the space at all, it was a one-time thing and it isn't going to grow again. That's the kind of situation where a one-off shrink makes sense. If you read that article and the links in it, you'll see that regular shrinking mostly just causes fragmentation. If you do need to do a one-time, once-in-a-very-great-while shrink, you want to be working on files instead of the database, as you have more control over what is being sized to what.
You wrote that you shrink the database every time it grows more than 500 MB. If it keeps doing that, it means the database needs that space. So let the size sit at what it needs for its regular activity.
How the size and space are managed depends in part on your recovery model. The default recovery model for SSISDB is full. That means you need to be doing regular log backups to keep the log size manageable. It may be a few times a day or every five minutes - it really depends on the amount of activity in that database. If your packages aren't changing frequently, it may be better to have the recovery model as simple. It depends on your recovery needs as to whether it should be full or simple.
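As a rough sketch of the two options (the backup path below is just a placeholder - point it at your real backup location):
-- see which recovery model the catalog is using
SELECT name, recovery_model_desc FROM sys.databases WHERE name = N'SSISDB';
-- option 1: stay in full recovery and schedule regular log backups
BACKUP LOG SSISDB TO DISK = N'D:\Backups\SSISDB_log.trn';
-- option 2: if point-in-time recovery of SSISDB isn't needed, switch to simple
ALTER DATABASE SSISDB SET RECOVERY SIMPLE;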
Then in terms of the space, as I said, you would want to manage that by files. And make sure you actually need to shrink a file - that it has grown way out of whack, that it won't happen again, and that it won't grow back to the size it's at now. If that's the case, you would want to check the space usage of the individual files. A quick and dirty way to check just the SSISDB database would be something like:
USE SSISDB
GO
SELECT DB_NAME() AS DBName,
       [name] AS FileName,
       CAST(size / 128.0 AS DECIMAL(10,2)) AS CurrentSizeMB,
       CAST((CAST(size / 128.0 AS DECIMAL(10,2)) -
             CAST(FILEPROPERTY([name], 'SpaceUsed') AS INT) / 128.0) AS DECIMAL(10,2)) AS FreeSpaceMB
FROM sys.database_files;
If you do have a file that is excessively large and doesn't need to be that size, you would then use DBCC SHRINKFILE to reduce the size of that file.
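As a rough illustration - the logical file name is whatever the query above returns for the oversized file, and the target is in MB:
-- shrink one file to a specific target size
DBCC SHRINKFILE (N'data', 6000);
-- or just release unused space at the end of the file without moving any pages
DBCC SHRINKFILE (N'data', TRUNCATEONLY);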
If 16 GB is a lot for your environment and that much space taken up is a problem, then you are likely running incredibly low on drive space, which in and of itself is a risk to the company. But if you do have to get that space back, you can use that script to see which file has all the free space. If it's the log, then you likely need to schedule regular log backups and should also evaluate whether simple recovery works better for this database.
Sue
September 20, 2017 at 2:55 am
Sue_H - Tuesday, September 19, 2017 2:55 PM
Hi Sue
Thanks for your response. There are a few articles linked from the one you mention about not shrinking databases, so it will take me some time to read through them. I've run the code you provided and get the following results:
SSISDB (data) = 21070 MB, of which 17974 MB is free.
SSISDB (log) = 4976.44 MB, of which 4828 MB is free.
I'm not a database expert, so I'm learning things as I go, unfortunately. We started here with about 15 databases, all using the full recovery model. Every database had a maintenance plan which ran every night; it shrank each database and then rebuilt the indexes. We have a 1 TB drive exclusively for our DB backups, which should be plenty (currently only 400 GB is used for a rolling 30-day set of backups).
The backup drive filled up about a year ago, and I found out this was because the databases were only getting full backups with no transaction log backups, so the logs were free to grow until the drive filled up. I set up transaction log backups every 2 hours throughout the day and kept all backups for a rolling 30-day period. This kept the backups under control until recently, when the drive filled up again - this time I found it was because of this SSISDB, which it appears was created automatically with a recent upgrade to SSMS 2014 and seems to store job history. It was set to the default of 365 days' retention and over the year grew to 27 GB, and backing that up every day filled the backup drive again.
I have heard many people say don't shrink databases, but I'll admit I've never seen any problems, because (a) our previous maintenance plans rebuilt the indexes after shrinking every database each night, and (b) we've not seen any issues that could be attributed to the shrinking. Not being DBA experts (none of us here are), we also leave most database options at their default settings, which includes the auto-growth settings (data is set to 1 MB unrestricted growth, and logs are set to 10% growth). We do need to research this area, as I think there is much more to it.
I do not understand why running the shrink database task is not getting rid of this 'free' 16 GB of space held in the database and 4 GB in the log file, even when I set it to 'return free space to the operating system'.
Rob
September 20, 2017 at 12:48 pm
Note that the transaction log can't be shrunk past the active portion of the log. You can execute DBCC LOGINFO in SSISDB and look for the VLFs with a status of 2 (that's the active portion); the log file can only be shrunk back to the last active VLF, so if those sit near the end of the file you won't be able to shrink much. That's the part of the log with active transactions needed for recovery. You can do a log backup if needed, which normally marks more of the log as reusable and lets you shrink further - running DBCC LOGINFO again will then show more VLFs with a status of 0.
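For example (run it in the context of the database whose log you're checking):
USE SSISDB
GO
-- Status = 2 marks active VLFs that can't be removed; Status = 0 means reusable
DBCC LOGINFO;
GO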
SSISDB gets created manually, so somehow, someone, somewhere created that catalog. You really want to understand the encryption key for that database, as that's important (especially for restores). You can find that information, as well as more detail, in this documentation:
SSIS Catalog
And you are absolutely correct that there is a lot more to it. Just to start, you will want to:
Figure out what the appropriate recovery model is for your databases.
Change the growth factors for the files from a percentage (or a tiny fixed amount like 1 MB) to a sensible fixed MB amount - see the sketch after this list.
Learn more about the log file, recovery models, and backup plans
Understand SSISDB
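For the growth change, it's a file-level ALTER - the logical file names below are placeholders, so use whatever sys.database_files shows for SSISDB:
ALTER DATABASE SSISDB MODIFY FILE (NAME = N'data', FILEGROWTH = 256MB);
ALTER DATABASE SSISDB MODIFY FILE (NAME = N'log', FILEGROWTH = 128MB);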
This should help in terms of how the log works and explains the active portion of the log.
SQL Server Transaction Log Architecture and Management Guide
These go over some of the autogrowth increments:
Blitz Result: Percent or 1MB Growth in Use, Slow File Growths
Autogrowth - Percentage Use?
Sue
October 16, 2017 at 3:38 am
Hi Rob,
I agree with Sue. SHRINKFILE lets you target just the log file (so does the GUI, but there you still have the temptation, when you just want a damn solution, to shrink everything and worry about it later). Resist that; just shrink the log repeatedly, in as small an increment as you need, to make progress.
Fundamentally, and in the mid term, why do you need to keep so much data about your job processes? I've consulted at some pretty sensitive, audit-heavy environments and not had to keep anything like 25 GB of such data. I've set up a couple of SSIS servers performing at least a couple of hundred transformations per day, and I would have been impressed if the history log came to 2.5 GB, let alone 25.
I suspect one of two things is happening: either whoever set this up left the 'Log Everything' box ticked (not saying there IS such a box, I'm guessing!), when in fact you probably only need a record of whether a job failed or succeeded each day, which shouldn't come CLOSE to 25 GB; or, for audit/compliance purposes, you genuinely do need to keep such verbose data.
If the latter, you need to give some thought to biting the bullet and provisioning extra storage, wherever that may be - which shouldn't be painful, as you have a clear and present need and therefore a business case.
If the former, you need to rationalise both the depth and the age of the data you're keeping, with the emphasis now on the depth. I think the forum has an SSIS subsection; it might be worth a fresh post there if you don't feel confident poking around the logging settings.
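For what it's worth, I believe the catalog has a server-wide logging level you can check and turn down - this is from memory, so verify the property name before relying on it:
-- check the current server-wide SSIS logging level (assumed property name)
SELECT property_name, property_value
FROM SSISDB.catalog.catalog_properties
WHERE property_name = N'SERVER_LOGGING_LEVEL';
-- 1 = Basic, 3 = Verbose; Basic should be enough if you only need pass/fail history
EXEC SSISDB.catalog.configure_catalog @property_name = N'SERVER_LOGGING_LEVEL', @property_value = 1;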