November 8, 2013 at 1:31 pm
Hello Everyone!
I am a novice in T-SQL programming. I have an urgent task, as part of the maintenance plan, described as follows:
Write a script that will delete files from a specific folder every 15 days and send a report after the action is completed.
I came up with the following script, but it does not seem to do the job:
------------------------------
DECLARE @Date nvarchar(10)
DECLARE @Time nvarchar(12)
DECLARE @DTime nvarchar(23)
DECLARE @ShelfLife int
SET @ShelfLife = 15 -- How long to keep backup files (15 days)
SET @Date = CONVERT(date, GETDATE() - @ShelfLife, 109)
SET @Time = CONVERT(time(0), GETDATE(), 108)
SET @DTime = @Date + 'T' + @Time
EXECUTE master.dbo.xp_delete_file 1, N'E:\BACKUPS\MaintenancePlanReports', N'txt', @DTime, 1
--------------------------------
Any suggestions are highly appreciated.
Thanks in advance.
November 8, 2013 at 1:42 pm
Hm. I could be wrong on this, but I believe xp_delete_file will only work on SQL Server backup and report files; that is, files ending in .trn, .bak, and .rpt. It looks like you're trying to delete .txt files with it, which shouldn't work. You may have to resort to using a PowerShell script to do this instead.
- 😀
November 8, 2013 at 1:53 pm
Thank you for your reply and for your suggestion.
I will try it and see what comes of this suggestion.
Thanks again.
November 8, 2013 at 2:14 pm
I am sorry, but I could not find enough information on what I am trying to accomplish. Here is what I tried, but without success:
EXEC master..xp_cmdshell 'del C:\file.txt'
November 8, 2013 at 3:07 pm
Ah, you'll probably need to create an Agent job and set the job step type to PowerShell. Using xp_cmdshell may work, but it doesn't have the flexibility of PowerShell. Here's the script I use to clean up transaction log backups older than 14 days, altered a bit:
$a = Get-ChildItem "<drive>:\<folder>" -recurse
foreach($x in $a)
{
$y = ((Get-Date) - $x.CreationTime).Days
if ($y -gt 14 -and $x.PSIsContainer -ne $True)
{$x.Delete()}
}
Replace the <drive> and <folder> parts as needed with your filesystem locations, and adjust the 14 in if ($y -gt 14) to your desired number of days.
Keep in mind this will delete ALL files older than the specified number of days in the provided location, regardless of type; if you need file-extension filtering, the untested sketch just below shows one way to do it. Also, of course, test this in a disposable directory first to make sure it's doing what you need it to.
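For instance, limiting the cleanup to .txt files could be as simple as adding Get-ChildItem's -Filter parameter. This is just a sketch of that idea, not something I've run; swap in your own path and day count:
$a = Get-ChildItem "<drive>:\<folder>" -Filter *.txt -Recurse
foreach($x in $a)
{
$y = ((Get-Date) - $x.CreationTime).Days
if ($y -gt 14 -and $x.PSIsContainer -ne $True)
{$x.Delete()}
}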
- 😀
November 8, 2013 at 4:41 pm
What about using a VBS script executed by a Windows scheduled task? I use this method and write out my results to a text file.
***The first step is always the hardest *******
November 8, 2013 at 4:47 pm
hisakimatama (11/8/2013)
Using xp_cmdshell may work, but it doesn't have the flexibility of PowerShell....{snip}...
if you need file-extension filtering, I may be able to alter it as needed.
BWAAA-HAAA!!!! Too funny. While PowerShell may be able to do certain things that DOS cannot, DOS (via xp_CmdShell) sure does make it look easy to me (notice... I have not tested this script)... even easier than the PowerShell script. This one even has filtering for the extension. 😉
forfiles -p "C:\SomeFolder" -s -m *.txt -d -14 -c "cmd /c del @path"
Shifting gears a bit, there are two things that caught my eye on the original post.
First, the title says "Delete Reports", not .txt files. Second, the requirement is to do this from a Maintenance Plan and the related created job. If both are true, then there's no need for either PowerShell or xp_CmdShell. All of the necessary features are available by clicking and dragging in the MP or a simple selection in the job, as is the ability to send a notification email for pass/fail, etc.
Considering the wording of the original question, I'm thinking this is either a question on a test or a question on an interview. So, my question to the OP would be, have you tried doing it from a Maintenance Plan because that's what they actually want to see. There is no code required for this.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 9, 2013 at 6:27 am
Huh, so doing that via command-line syntax is possible! Guess I slipped up on that one :-P. I seem to recall trying to do something similar via cmdshell before, and I didn't have a bit of luck finding information on it. Clearly, I needed more Google-Fu :-D. Duly noted, Jeff!
- 😀
November 9, 2013 at 8:53 am
I was also going to mention FORFILES. I use it for a similar purpose: Deleting trace files older than X days. I schedule it through SQL Scheduled Jobs.
[font="Courier New"]forfiles /p "i:\tracefiles" /s /m *.trc /c "cmd /c del @path" /d -45[/font]
I think the syntax varies slightly depending on the version of Windows.
November 9, 2013 at 12:53 pm
hisakimatama (11/9/2013)
Huh, so doing that via command-line syntax is possible! Guess I slipped up on that one :-P. I seem to recall trying to do something similar via cmdshell before, and I didn't have a bit of luck with finding information on it. Clearly, I needed more Google-Fu :-D. Duly noted, Jeff!
No problem. I just happen to be a big fan of DOS for doing certain things because the commands are sometimes so very easy to use compared (for me at least) to some of the PowerShell stuff (I gave up "programming" way back in 2002 so things like PowerShell have become a bit strange for me).
As a bit of a sidebar, it's really kind of funny. Everyone said DOS was a "dead language" but the ForFiles command wasn't available until (IIRC) Windows Server 2003. Even XP didn't have it. Adding to that, RoboCopy and a couple of other goodies came out with Windows Vista/Windows 2008 and maybe it's not the "dead language" that everyone thought (I say after crossing heart and, with folded hands, looking pleadingly towards the sky :-)).
--Jeff Moden
Change is inevitable... Change for the better is not.
November 14, 2013 at 7:57 am
I would like to thank each one of you for your replies and suggestions.
No, this is not a test question. We have jobs that are supposed to delete files older than a certain number of days. For some reason, the job is not working on our text files.
Thank you,
November 14, 2013 at 9:42 am
hisakimatama (11/8/2013)
Hm. I could be wrong on this, but I believe xp_delete_file will only work on SQL Server backup and report files; that is, files ending in .trn, .bak, and .rpt. It looks like you're trying to delete .txt files with it, which shouldn't work. You may have to resort to using a PowerShell script to do this instead.
Simbo_mk (11/14/2013)
I would like to thank each one of you for your replies and suggestions. No, this is not a test question. We have jobs that are supposed to delete files older than a certain number of days. For some reason, the job is not working on our text files.
Thank you,
Like hisakimatama said in the second post on this thread (quoted above, as well), xp_delete_file won't delete .txt files. It works only for backup and report files and they have to have the correct embedded header, to boot.
As a bit of a sidebar, this demonstrates that great care and very careful research/testing should be done before using any undocumented stored procedure.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 14, 2013 at 2:06 pm
Hello again,
1.) I typed the following PowerShell script in my job step:
$a = Get-ChildItem "E:\Backups\MaintenancePlanReports" -recurse
foreach($x in $a)
{
$y = ((Get-Date) - $x.CreationTime).Days
if ($y -gt 14 -and $x.PSIsContainer -ne $True)
{$x.Delete()}
}
2.) I executed the job
I am getting the following error as a result:
The corresponding line is '{$x.Delete()}'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Exception calling "Delete" with "0" argument(s): "The process cannot access the file 'LogBackup-HourlyRun.txt' because it is being used by another process." '. Process Exit Code -1. The step failed.
Any suggestions???
Thank you!
Simbo_mk
November 14, 2013 at 4:25 pm
Hm. From the error message, it seems someone or something is using the file you're trying to delete. If you accidentally opened it before you deleted it and kept it open (and I know I've done that myself far, far too many times! :-D), close the file and run the script again. Otherwise, see if someone else has access to the file and opened it for some reason.
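If something opens that file on a schedule (a log writer, say) and you'd rather have the cleanup carry on past it, one option is to wrap the delete in a try/catch so a single locked file doesn't fail the whole step. Again, an untested sketch, reusing the same path and day count from your job step:
$a = Get-ChildItem "E:\Backups\MaintenancePlanReports" -Recurse
foreach($x in $a)
{
$y = ((Get-Date) - $x.CreationTime).Days
if ($y -gt 14 -and $x.PSIsContainer -ne $True)
{
try { $x.Delete() }  # delete when nothing has the file locked
catch { Write-Output "Skipped (file in use?): $($x.FullName)" }  # note it and move on
}
}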
- 😀
November 15, 2013 at 12:32 pm
Simbo_mk (11/14/2013)
Any suggestions???
Yes... use the DOS command I posted instead. 😛
--Jeff Moden
Change is inevitable... Change for the better is not.