April 6, 2010 at 4:04 pm
Currently our SQL process is scheduled to pick up files from their respective directories at a set time.
The majority of our job failures are due to missing files, and it's a nightmare during the holidays trying to track who is sending files and who isn't. I would like to come up with a better process that looks for the files instead of relying on a scheduled job on the SQL side: if a file is not found, send an email to the group responsible for it and hold the job until the file is present; when the file is dropped in the directory, call the SQL job to run (completely automated). Has anyone gone through this exercise at their company? The three options below appear to be possible solutions; any other recommendations or advice would be greatly appreciated.
Looping over files with the Foreach Loop File Enumerator
File Watcher Task (http://www.sqlis.com/23.aspx)
Service Broker Messaging
April 13, 2010 at 2:26 pm
We've implemented a solution that involves a home-grown .NET service that monitors a specific folder, waiting on the Windows directory-changed event.
We tie that service to a couple of meta tables (Jobs, Files, and Status) that identify whether the dropped file is associated with a DTS or SSIS package. Another thread of the service queries the meta tables every n seconds and invokes the appropriate ETL through the sp_OACreate proc, running a DTEXEC or DTSRun command with parameters driven from the meta tables.
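To make the shape of it concrete, here is a rough sketch of that kind of two-thread service in VB.NET. The folder, connection string, and the dbo.FileQueue / dbo.PackageMap table names are placeholders made up for illustration, and for brevity it shells DTEXEC directly with Process.Start instead of going through sp_OACreate as our service does; treat it as a starting point, not production code.

```vb
' Rough sketch only: folder, connection string, and meta table names are made up.
Imports System.Data.SqlClient
Imports System.Diagnostics
Imports System.IO
Imports System.Threading

Module FileWatcherServiceSketch
    Private Const ConnStr As String = "Server=.;Database=EtlMeta;Integrated Security=SSPI;"

    Sub Main()
        ' Thread 1: react to the directory-changed (Created) event and queue the file.
        Dim watcher As New FileSystemWatcher("C:\Inbound")
        AddHandler watcher.Created, AddressOf OnFileCreated
        watcher.EnableRaisingEvents = True

        ' Thread 2: poll the meta tables every n seconds and launch the matching package.
        Dim poller As New Thread(AddressOf PollQueue)
        poller.IsBackground = True
        poller.Start()

        Console.ReadLine()  ' a real implementation would run as a Windows service
    End Sub

    Private Sub OnFileCreated(ByVal sender As Object, ByVal e As FileSystemEventArgs)
        Using cn As New SqlConnection(ConnStr), _
              cmd As New SqlCommand("INSERT dbo.FileQueue (FileName, Status) VALUES (@f, 'Pending')", cn)
            cmd.Parameters.AddWithValue("@f", e.FullPath)
            cn.Open()
            cmd.ExecuteNonQuery()
        End Using
    End Sub

    Private Sub PollQueue()
        Dim pending As String = _
            "SELECT TOP 1 q.QueueId, m.PackagePath " & _
            "FROM dbo.FileQueue q JOIN dbo.PackageMap m ON q.FileName LIKE m.FilePattern " & _
            "WHERE q.Status = 'Pending'"
        Do
            Using cn As New SqlConnection(ConnStr), cmd As New SqlCommand(pending, cn)
                cn.Open()
                Using rdr As SqlDataReader = cmd.ExecuteReader()
                    If rdr.Read() Then
                        Dim queueId As Integer = rdr.GetInt32(0)
                        Dim packagePath As String = rdr.GetString(1)
                        rdr.Close()
                        ' Fire the ETL; file-specific parameters could be passed with /SET.
                        Process.Start("DTEXEC.exe", "/F """ & packagePath & """").WaitForExit()
                        Using upd As New SqlCommand("UPDATE dbo.FileQueue SET Status = 'Done' WHERE QueueId = @id", cn)
                            upd.Parameters.AddWithValue("@id", queueId)
                            upd.ExecuteNonQuery()
                        End Using
                    End If
                End Using
            End Using
            Thread.Sleep(TimeSpan.FromSeconds(30))  ' "every n seconds"
        Loop
    End Sub
End Module
```

One thing to watch for with FileSystemWatcher is that the Created event can fire while the sender is still writing the file, so some kind of "is the file ready" check is worth adding before queuing it.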
April 13, 2010 at 4:33 pm
I've done that before with just a very simple VB.NET Windows service. All it does is watch a folder and then fire a specific package based on the filename. Very lightweight and reliable.
As you mentioned, this is only one of several ways to do it, but even without much .NET experience it's pretty easy to find samples and modify them for your purposes; a stripped-down sketch is below.
Good luck.
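For reference, a stripped-down version looks something like this in VB.NET. The folder, filename checks, and package paths are just made-up examples, and for real use you'd host it in a Windows service rather than a console app.

```vb
' Minimal "watch a folder, fire a package by filename" example; all paths are placeholders.
Imports System.Diagnostics
Imports System.IO

Module SimpleWatcher
    Sub Main()
        Dim watcher As New FileSystemWatcher("C:\Inbound")
        AddHandler watcher.Created, AddressOf OnCreated
        watcher.EnableRaisingEvents = True
        Console.ReadLine()  ' host this in a Windows service for production use
    End Sub

    Private Sub OnCreated(ByVal sender As Object, ByVal e As FileSystemEventArgs)
        ' Pick the package to run based on the name of the file that just arrived.
        Dim package As String = Nothing
        If e.Name.ToLower().Contains("sales") Then
            package = "C:\Packages\LoadSales.dtsx"
        ElseIf e.Name.ToLower().Contains("inventory") Then
            package = "C:\Packages\LoadInventory.dtsx"
        End If

        If package IsNot Nothing Then
            Process.Start("DTEXEC.exe", "/F """ & package & """")
        End If
    End Sub
End Module
```

From there it's just a matter of mapping your real file names to your real packages and adding whatever logging or email notification you need.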