May 31, 2016 at 10:24 am
Guys,
I have inherited a package that does the following:
SQL: An EXEC that triggers a data-driven subscription in SSRS (see below)
SCRIPT: Waits for a given length of time for the subscription to generate the PDF files into the folder
Gets a list of the operatives that should each have a file - saved to ResultSet
FOR EACH OPERATIVE in ResultSet
SQL: Generates the FullPathName to the file for the specific operative and saves the details to a set of single-value variables
Lookup: Looks the operative up to get their Social Security number from a separate unlinked database
EXEC Process: Calls an encryption program to encrypt the PDF with the SSN
Move: the encrypted file to another folder
Delete: the original file
Lookup: Looks up the operative's email address in another unlinked database
SendMail: Sends the encrypted file to the Exchange server with the operative's email address in the To: field
Delete: the encrypted file
The problem is that files may take longer than the prescribed time to be generated, depending on server load. If a file is missing, the process crashes because details of non-existent files are sent to the encrypter.
I know how many files there should be (but it varies week on week). Is there a way to count the number of files generated and only proceed to the looping if all the files are there? In a previous life I would have used the Konesans File Watcher Task, but I don't have the necessary credentials to get it installed.
May 31, 2016 at 10:34 am
I've got a similar process, where I need exactly 17 files in a folder that is cleared out daily.
A simple Script Task to count the files is what I use:
// Requires: using System.IO;
string WorkingFolder = Dts.Variables["WorkingDirectory"].Value.ToString();
string[] AllFiles = Directory.GetFiles(WorkingFolder, "*.csv");      // list the CSVs in the folder
Dts.Variables["AllFilesExist"].Value = (AllFiles.Length == 17);      // Boolean package variable
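If the files can still be arriving when the task runs (as in the original question), a variation is to wrap the same check in a polling loop with a timeout. A rough sketch only, assuming the package also has Int32 variables ExpectedFileCount and TimeoutSeconds and a Boolean AllFilesExist (those names are illustrative, not from the original package):

// Sketch: poll the folder until the expected number of PDFs exists, or give up after a timeout.
// Requires: using System.IO;
string workingFolder = Dts.Variables["WorkingDirectory"].Value.ToString();
int expected = (int)Dts.Variables["ExpectedFileCount"].Value;        // illustrative variable
int timeoutSeconds = (int)Dts.Variables["TimeoutSeconds"].Value;     // illustrative variable

DateTime deadline = DateTime.Now.AddSeconds(timeoutSeconds);
bool allPresent = false;

while (DateTime.Now < deadline)
{
    if (Directory.GetFiles(workingFolder, "*.pdf").Length >= expected)
    {
        allPresent = true;
        break;
    }
    System.Threading.Thread.Sleep(5000);   // re-check every 5 seconds
}

Dts.Variables["AllFilesExist"].Value = allPresent;

A precedence constraint on AllFilesExist can then decide whether the For Each loop runs or the package raises an error instead.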
Lowell
May 31, 2016 at 10:57 am
Why depend on two asynchronous processes?
Can the process that generates the files log when it's done, and then have the process that uses the files check that log? Make it an actual workflow instead of two processes?
If that's not possible, PowerShell or any .NET scripting component can check files against a provided list and validate their existence.
But even with that, a large file can "exist" but still be in flight, still being written. If you start a second process on a file that's still being written to, you can end up with corruption or other errors.
That's why an actual workflow is definitely better. If it can be done.
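If you do have to rely on checking the file system, one common way to test for the in-flight case from a Script Task is to try to open the file exclusively; a sketch only, and it assumes the writer keeps the file open until it has finished:

// Returns true only if the file can be opened with an exclusive lock,
// i.e. nothing else still has it open for writing. Requires: using System.IO;
public bool IsFileReady(string path)
{
    try
    {
        using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return true;
        }
    }
    catch (IOException)
    {
        return false;   // still locked by the writer (or otherwise inaccessible)
    }
}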
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
June 1, 2016 at 6:18 am
@G^2
I hate the way the package process has been designed, precisely because of the async nature of the initiating process. Without a rewrite, the short answer is no. The SQL statement writes a record to the SSRS subscription event table, which is then consumed internally by the Reporting Services Windows service, so the report generation is completely divorced from the package execution. I quite like the idea of the script to count the files.
Alternatively, I should know the last file to be produced, so I could just use WMI to check for its existence.
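For reference, the existence check itself can also be done without WMI, from a small Script Task inside a For Loop container; a sketch, assuming a string variable LastFilePath and a Boolean LastFileExists (both names illustrative):

// Sketch: set a Boolean the surrounding For Loop can test. Requires: using System.IO;
string lastFile = Dts.Variables["LastFilePath"].Value.ToString();
Dts.Variables["LastFileExists"].Value = File.Exists(lastFile);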
June 1, 2016 at 9:08 am
Aaron,
I had to do something like this.
Rather than checking the file system, I checked the status of the report subscription.
I used ReportServer.dbo.AddEvent to fire the subscription and then checked the LastStatus column in the ReportServer.dbo.Subscriptions table to determine whether the report had completed.
You would then be able to wait for each subscription to complete (long or short), which guarantees that the file will exist.
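Roughly along these lines, if it helps; this is only a sketch of the idea (the connection string and subscription GUID are placeholders, the exact LastStatus wording should be checked against your ReportServer, and a real version needs failure and timeout handling):

// Sketch: fire the subscription via dbo.AddEvent, then poll dbo.Subscriptions.LastStatus.
// Requires: using System.Data.SqlClient; and using System.Threading;
string connStr = "Data Source=MyReportServer;Initial Catalog=ReportServer;Integrated Security=SSPI;";
string subscriptionId = "00000000-0000-0000-0000-000000000000";   // placeholder subscription GUID

using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();

    // Queue the subscription (the same event the SSRS schedule would raise).
    using (SqlCommand fire = new SqlCommand(
        "EXEC dbo.AddEvent @EventType = N'TimedSubscription', @EventData = @SubId", conn))
    {
        fire.Parameters.AddWithValue("@SubId", subscriptionId);
        fire.ExecuteNonQuery();
    }

    // Poll until LastStatus reports completion (data-driven subscriptions end up as
    // "Done: n processed of n total; 0 errors."). NB: a real version should also check
    // that LastRunTime has advanced, so a status left over from the previous run isn't
    // mistaken for this one.
    string lastStatus = "";
    while (!lastStatus.StartsWith("Done:"))
    {
        Thread.Sleep(10000);   // check every 10 seconds
        using (SqlCommand check = new SqlCommand(
            "SELECT LastStatus FROM dbo.Subscriptions WHERE SubscriptionID = @SubId", conn))
        {
            check.Parameters.AddWithValue("@SubId", subscriptionId);
            lastStatus = Convert.ToString(check.ExecuteScalar());
        }
    }
}

In the package itself this would more naturally be an Execute SQL Task for the AddEvent call and a For Loop container around a second Execute SQL Task for the status check, but the moving parts are the same.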
Jez