November 8, 2011 at 3:50 pm
Hi all,
I am trying to develop a script that will delete old backup files in a backup device/file directory if
1) the backup files are older than 7 days, and
2) the backup folder contains more than 2 backup files
(each database has its own subfolder).
The script needs to run daily on the backup device server, which is not a SQL Server machine.
The reason for requirement 2) is that if a backup job has failed for a few days, we don't want to delete the old backup files.
Requirement 1) is easy, but I don't know how to implement 2).
November 10, 2011 at 1:50 pm
You can pipe the results of Get-ChildItem to Measure-Object to get a count of files.
Something like this:
(Get-ChildItem -path "C:\DBBackups\" | Where-Object {!$_.PSIsContainer} | Measure-Object).Count
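If you only want to delete when that count exceeds your threshold, you could wrap it in a test first. A rough sketch (the path and the 2-file minimum are placeholders based on your requirements):
# Count only files (not subfolders) in the backup directory
$fileCount = (Get-ChildItem -Path "C:\DBBackups\" | Where-Object { !$_.PSIsContainer } | Measure-Object).Count
# Requirement 2: only clean up when more than 2 backup files exist
if ($fileCount -gt 2) {
    # Requirement 1: deletion of files older than 7 days would go here
}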
- Jeff
November 10, 2011 at 2:00 pm
Thanks for the reply. So far, I have worked through the following step:
Get-ChildItem $Path -Recurse | Where-Object { !$_.PSIsContainer } | Group-Object Directory | Where-Object { $_.Count -gt 3 }
How do I pipe the result out and then check the date of the files in each directory?
Thanks
November 10, 2011 at 3:10 pm
I figured it out by myself.
The following will delete old files under a path $Path recursively, if the file count in a folder is greater than $CopytoKeep and the files are older than $DatetoDelete.
# Find folders with more than $CopytoKeep files, then delete the files
# in those folders that are older than $DatetoDelete.
Get-ChildItem $Path -Recurse | Where-Object { !$_.PSIsContainer } |
    Group-Object Directory | Where-Object { $_.Count -gt $CopytoKeep } |
    ForEach-Object {
        # $_.Name holds the folder path for each group
        Get-ChildItem $_.Name |
            Where-Object { $_.LastWriteTime -lt $DatetoDelete } |
            Remove-Item -Force
    }
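For example, the variables might be set like this before the pipeline runs (the path is a placeholder; the 2 copies and 7 days come from the original requirements):
$Path = "D:\DBBackups"                   # placeholder root folder; one subfolder per database
$CopytoKeep = 2                          # keep at least 2 backup files per folder
$DatetoDelete = (Get-Date).AddDays(-7)   # files older than 7 days are eligible for deletion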
January 17, 2012 at 2:16 pm
Can you give a specific example? I am working on the same scenario.