November 30, 2012 at 11:20 pm
I need to delete some backup files as part of a daily operation through SQL. Currently I am using xp_cmdshell. Is there any alternative to xp_cmdshell, so I don't need to enable it on all servers?
December 1, 2012 at 2:07 am
You could write a PowerShell script to delete the files and schedule it with SQL Agent or Windows Task Scheduler.
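For example, a minimal sketch of such a script (the folder path and retention period here are assumptions; adjust for your environment):

```powershell
$BackupPath   = 'E:\SQLBackup'   # assumption: your backup folder
$DaysToRetain = 5                # assumption: your retention period
$cutoff = (Get-Date).AddDays(-$DaysToRetain)

# Preview the candidate files with -WhatIf first;
# remove -WhatIf once you've verified the list.
Get-ChildItem -Path "$BackupPath\*" -Include *.bak, *.trn |
    Where-Object { $_.CreationTime -lt $cutoff } |
    Remove-Item -WhatIf
```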
December 1, 2012 at 5:39 am
Do you have to do it through SQL Server? Why not use a VBS script and Windows Task Scheduler?
*** The first step is always the hardest ***
December 1, 2012 at 6:08 am
Here's an alternative. Sneaky, but it works.
December 1, 2012 at 9:23 am
Thanks, all. Any idea what the Maintenance Cleanup task uses in SQL Server to clean up backup files?
December 1, 2012 at 9:34 am
You can use the 'undocumented' xp_delete_file, which is called by the Maintenance Cleanup task. Since it is undocumented, it may not work in future versions, but so far it has worked well for me from 2005 through 2008 R2.
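For anyone searching later, a hedged example of the call (the path and cutoff date are placeholders, and the parameter meanings below are inferred from tracing the Maintenance Cleanup task, since the procedure is undocumented):

```sql
-- Parameters, as captured from a maintenance plan trace:
--   file type (0 = backup files, 1 = report files),
--   folder path, file extension (no dot),
--   cutoff date (files older than this are deleted),
--   subfolder flag (1 = include first-level subfolders)
EXECUTE master.dbo.xp_delete_file 0, N'E:\SQLBackup\', N'bak',
        N'2012-11-26T00:00:00', 0;
```

Note that it only deletes files it recognizes as SQL Server backup files (it checks the file header), so it won't remove arbitrary files with a .bak extension.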
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
December 1, 2012 at 12:00 pm
Or, instead of writing your own backup and maintenance scripts, you could use the ones by Ola Hallengren.
Highly recommended.
December 2, 2012 at 8:04 am
Richard Fryar (12/1/2012)
Or, instead of writing your own backup and maintenance scripts, you could use the ones by Ola Hallengren. Highly recommended.
Do you use it, Richard?
--Jeff Moden
Change is inevitable... Change for the better is not.
December 2, 2012 at 11:40 am
Hi Jeff,
Yes I do use it. It's good to find such well written and supported code provided free to the community.
I should have mentioned that "other scripts are available" as the BBC would say 🙂
December 3, 2012 at 6:40 am
I use this VBS script, which I wrote to do exactly that (it works; please don't tell me that's old school).
As explained in the comments, one benefit of this over a simple delete script is that it makes sure that the archive bit on each file has been cleared, which means that the network backup system has backed up that SQL backup file to tape. We occasionally experience times when our network backups hang, and I don't want to delete any SQL backups from disk until they've been copied off to tape.
Without xp_cmdshell, I don't think there's any way natively within SQL Server to check the archive bit of a file; you'd need PowerShell, CLR, or some such.
BTW, I have a separate Agent job that runs after this job does. If it finds any backup files created before the nightly system backup last ran, with an uncleared archive bit -- in other words, a file that should have gone off to tape last night but did not -- it sends me an email alert that the network backup may have had problems.
' Find all backup files in the SQL backup folder we can safely delete.
' "Safely delete" means:
' - file creation date older than DaysToRetain days AND
' - the archive bit has been cleared (meaning they've gone to tape)
' Created 1/7/2011 as an alternative to the SQL Maintenance Plan task so that
' we can not only check that the backups are more than x days old, but also check
' that the archive bit has been cleared before deleting them!
' Too many tape backup failures...
' ======================================================================================
'MODIFICATIONS
' ======================================================================================
'2012-09-18: Changed path to new HD installed on server
' ======================================================================================
' How many days should we keep?
DaysToRetain = 5
Set fso = CreateObject("Scripting.FileSystemObject")
Set oFolder = fso.GetFolder("E:\SQLBackup")
' ======================================================================================
' Note the bitwise comparison below: Not is bitwise in VBScript, so
' (Not f.Attributes) And 32 is nonzero only when the Archive bit (32) is clear.
' Don't add parentheses after "Not" -- Not (f.Attributes And 32) is true for
' any attribute value, and files not yet copied to tape would be deleted.
For Each f In oFolder.Files
If (LCase(Right(f.Name, 3)) = "bak" Or LCase(Right(f.Name, 3)) = "trn") And _
f.DateCreated < DateAdd("d", (-1 * DaysToRetain), Date) And _
Not f.Attributes And 32 Then
f.Delete
End If
Next
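For comparison, here's the same archive-bit test in PowerShell, for anyone who'd rather skip VBS (a sketch only; the path and retention period are assumed):

```powershell
$cutoff = (Get-Date).AddDays(-5)          # assumption: 5-day retention
# -band tests the Archive attribute bit; 0 means it was cleared,
# i.e., the network backup has copied the file to tape.
Get-ChildItem 'E:\SQLBackup\*' -Include *.bak, *.trn |
    Where-Object {
        $_.CreationTime -lt $cutoff -and
        (($_.Attributes -band [IO.FileAttributes]::Archive) -eq 0)
    } |
    Remove-Item
```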
HTH,
Rich
December 3, 2012 at 7:50 am
rmechaber (12/3/2012)
I use this VBS script (it works, please don't tell me that's old school) I wrote to do exactly that.
It IS "old school"... and I love it 'cause I'm "old school"! 🙂
I don't delete Point-in-Time log files by date... I do it by LSN comparison based on the LSN's of the BAK files that I leave behind.
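A sketch of the idea (not Jeff's actual code; the file path is a placeholder): the LSN range of each backup file can be read from its header and compared.

```sql
-- Read the LSN range recorded in a backup file's header.
-- A .trn file whose LastLSN is at or before the FirstLSN of the oldest
-- full backup being retained can no longer contribute to a restore.
RESTORE HEADERONLY FROM DISK = N'E:\SQLBackup\MyDb_Full.bak';
-- Returns one row per backup set, including FirstLSN, LastLSN,
-- CheckpointLSN, and DatabaseBackupLSN columns.
```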
--Jeff Moden
Change is inevitable... Change for the better is not.
December 3, 2012 at 8:19 am
Jeff Moden (12/3/2012)
rmechaber (12/3/2012)
I use this VBS script (it works, please don't tell me that's old school) I wrote to do exactly that.
It IS "old school"... and I love it 'cause I'm "old school"! 🙂
I don't delete Point-in-Time log files by date... I do it by LSN comparison based on the LSN's of the BAK files that I leave behind.
Hah, OK, Jeff, I'll buy that!
But do you also check the archive bit? I virtually never see anyone's online tips for culling SQL backup files discuss this point, but to me it's vital. What good is a well-tested SQL backup process if you don't know that the files actually went off to tape?
I got interested in this b/c I used to be a sysadmin responsible for, among other things, the server backups, so I know how buggy backup software can be. "Sure, we can restore that SQL database, just lemme pull the files .... off... this.... tape...........:crying: "
Rich
December 3, 2012 at 5:18 pm
Yes, and for the very same reason you stated. If the archive bit hasn't been set to 0, then the file hasn't been backed up to tape and must not be deleted. If I end up with a file count of more than 2 for unarchived BAK files, then I put an extra band on the ol' porkchop launcher and head for the folks over in the Infrastructure group that do the tape backups.
As a sidebar, I don't use VBS for any of that (not that it's a bad way... I just do it "differently"). XP_CmdShell to a protected "DIR" with output to a temp table to control the whole thing does just fine for me.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 10, 2012 at 7:33 pm
Thanks much. What is the alternative for reading files from SQL if I don't want to use xp_cmdshell? I am using the below:
--Selecting all the backup files from backup Path
SELECT @sql = 'dir /B ' + @Path;
--Inserting all the existing backups in Temp Table
INSERT INTO #ExistingBackups EXEC master..xp_cmdshell @sql;
December 10, 2012 at 9:08 pm
sqlquery-101401 (12/10/2012)
Thanks much, what is the alternative for reading the file from sql if i dont want to use xp_cmdshell, i am using below
--Selecting all the backup files from backup Path
Select @sql = 'dir /B ' + @Path
--Inserting all the existing backups in Temp Table
insert#ExistingBackups exec master..xp_cmdshell @sql
--===== Create a work table to store the results from an EXEC
CREATE TABLE #FileInfo
(
RowNum INT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
DirFileName VARCHAR(128),
Depth INT,
IsFile BIT
)
;
--===== Create and populate the desired path variable
DECLARE @Path VARCHAR(500);
SELECT @Path = 'C:\';
--===== Get the file information and store it in the temp table.
-- The first "1" controls the file depth. 1 is current path only.
-- The second "1" says to include file names as well as directory names.
INSERT INTO #FileInfo
(DirFileName, Depth, IsFile)
EXEC xp_dirtree @Path,1,1
;
--===== Select just backup files (presumably, *.bak).
SELECT RowNum,
BackupFileName = DirFileName
FROM #FileInfo
WHERE IsFile = 1
AND DirFileName LIKE '%.bak'
;
Note that xp_DirTree is not a supported command but I've been using it for a very long time.
--Jeff Moden
Change is inevitable... Change for the better is not.