August 23, 2018 at 5:04 am
The following script, embedded in a package, works in the IDE, but when deployed and run from an Agent Job no files get deleted. The same parameter values are used in both cases, and the Agent Job owner has Full Control access rights on the folder.
The following parameter values are used:
pathTargetDirectory = "E:\Test Files\Processed"
fileFilter = "InvoicedFOC*.csv"
retentionPeriod = 30
public void Main()
{
    // Read the package parameters.
    string pathTargetDirectory = Dts.Variables["$Package::pathTargetDirectory"].Value.ToString();
    string fileFilter = Dts.Variables["$Package::fileFilter"].Value.ToString();
    int retentionPeriod = int.Parse(Dts.Variables["$Package::retentionPeriod"].Value.ToString());

    DateTime today = DateTime.Today;
    DirectoryInfo dirInfo = new DirectoryInfo(pathTargetDirectory);

    foreach (FileInfo f in dirInfo.GetFiles(fileFilter, SearchOption.TopDirectoryOnly))
    {
        // Delete the file once it is older than the retention period.
        DateTime fileCreatedDate = f.CreationTime.Date;
        DateTime retentionDate = fileCreatedDate.AddDays(retentionPeriod);
        if (today > retentionDate)
        {
            try
            {
                f.Delete();
            }
            catch (Exception ex)
            {
                // Note: Console.WriteLine output is not visible when the package
                // runs from an Agent job, so any failure here is silently swallowed.
                Console.WriteLine(ex);
            }
        }
        //MessageBox.Show(f.CreationTime.ToString());
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
August 23, 2018 at 5:11 am
The Agent job owner is not the right place to look. Instead, look at the context in which the job is being run – the SQL Agent service user, unless you have set up a proxy.
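One quick way to confirm that context is to have the package log the account it is actually running under. This is a hypothetical diagnostic (not part of the original package); it uses the standard `Environment.UserName` / `Environment.UserDomainName` properties, and from an Agent job the result will be the SQL Server Agent service account unless a proxy is configured.

```csharp
using System;

class WhoAmI
{
    static void Main()
    {
        // The account the process is running under. From an Agent job this is
        // the SQL Server Agent service account (or the proxy account, if one
        // has been set up for the job step).
        string runtimeUser = Environment.UserDomainName + "\\" + Environment.UserName;
        Console.WriteLine("Running as: " + runtimeUser);
    }
}
```

Inside a script task you could surface the same value with `Dts.Events.FireInformation` rather than `Console.WriteLine`, since console output is not captured by the Agent job history.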
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 23, 2018 at 5:28 am
Phil Parkin - Thursday, August 23, 2018 5:11 AM
The Agent job owner is not the right place to look. Instead, look at the context in which the job is being run – the SQL Agent service user, unless you have set up a proxy.
It was permissions. I had created the original test folder some weeks ago and created another yesterday. My original tests in the IDE used the folder created some weeks ago, where permissions were set correctly; I had missed setting them on the new folder. DOH!
I had added the catch block to allow the process to continue if an error occurred because of a file lock, as I did not want such an error to stop the process for the remaining files. All is good now. I will just go lie down for a while and contemplate life 🙂
August 23, 2018 at 5:41 am
tim.ffitch 25252 - Thursday, August 23, 2018 5:28 AM
It was permissions, I had created the original test folder some weeks ago and created another yesterday. My original tests in the IDE were using the folder created some weeks ago where permissions were set correctly. I had missed setting them in the new folder. DOH! I had added the Catch Block to allow the process to continue if an error occurred because of a file lock as I did not want such an error to stop the process for the remaining files. All is good now. I will just go lie down for a while now and contemplate life 🙂
I'm familiar with that feeling 🙂 Glad you got it sorted.
You could easily refine your CATCH block to specifically trap the case where the file is locked and fail the job in all other cases, should you choose to do so.
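As a minimal sketch of that refinement: a C# exception filter can swallow only "file in use" errors and let everything else (including the permissions error that caused this problem) propagate and fail the job. The `TryDelete` helper and the HResult constants below are my own names, not part of the original package; the HResult values are the standard Win32 sharing/lock violation codes.

```csharp
using System;
using System.IO;

static class FileCleanup
{
    // Win32 error codes surfaced by IOException when a file is in use.
    const int ErrorSharingViolation = unchecked((int)0x80070020);
    const int ErrorLockViolation    = unchecked((int)0x80070021);

    // Returns true if the file was deleted, false if it was locked.
    // Any other failure (e.g. UnauthorizedAccessException from missing
    // folder permissions) propagates, so the Agent job fails loudly
    // instead of silently doing nothing.
    public static bool TryDelete(FileInfo f)
    {
        try
        {
            f.Delete();
            return true;
        }
        catch (IOException ex) when (ex.HResult == ErrorSharingViolation
                                  || ex.HResult == ErrorLockViolation)
        {
            // File is locked by another process: skip it and carry on.
            return false;
        }
    }
}
```

In the loop above you would replace the bare `f.Delete()` try/catch with a call to `TryDelete(f)`, optionally firing a warning via `Dts.Events.FireWarning` for each skipped file.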