January 26, 2010 at 4:00 am
Hi,
I currently have a working SSIS package that uses a Foreach Loop container to load data from a csv file into a SQL table (dbo.LSR_Prepare), using an SSIS package variable (fileName) to populate a SQL field (File_Name) with the csv file name. If the load succeeds, the file is moved to an 'Extract Archive' sub-folder.
If it fails (mainly due to corrupted data), rows are still loaded into the SQL table up to the point the corrupt record is found, with the File_Name field populated as before, and the file is then moved to a 'Corrupt Files' sub-folder.
By linking 'on failure' from the first container, I have created a second Foreach Loop container that loops through the 'Corrupt Files' folder and, using a Script Task, populates the aforementioned package variable with the csv file name.
I now need to delete any records from the dbo.LSR_Prepare SQL table where File_Name = fileName (i.e. any csv file found).
Any ideas?
Thanks in advance,
January 26, 2010 at 4:12 am
I would use an Execute SQL Task and pass in the parameter name...
I would also probably use just the one container, with a failure precedence constraint after the Data Flow, instead of looping through the second folder...
In the Execute SQL Task you would need to map the parameter and use a statement such as:
DELETE FROM dbo.LSR_Prepare WHERE File_Name = ?
This article gives a good explanation of how the Execute SQL Task works, including the use of parameters.
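As a rough illustration of the parameterized delete described above, here is a minimal, self-contained sketch. It uses Python's sqlite3 module purely as a stand-in for the SQL Server call the Execute SQL Task would make; the table and column names follow the thread, while the sample rows and file names are invented for the example:

```python
import sqlite3

# Stand-in for dbo.LSR_Prepare in SQL Server; sqlite3 is used here only
# so that the sketch is self-contained and runnable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE LSR_Prepare (Id INTEGER, File_Name TEXT)")
conn.executemany(
    "INSERT INTO LSR_Prepare VALUES (?, ?)",
    [(1, "good.csv"), (2, "corrupt.csv"), (3, "corrupt.csv")],
)

# In SSIS, the Execute SQL Task would run this statement with the
# fileName package variable mapped to the ? placeholder.
file_name = "corrupt.csv"  # value the Script Task placed in fileName
conn.execute("DELETE FROM LSR_Prepare WHERE File_Name = ?", (file_name,))

# Only rows from files that loaded cleanly remain.
remaining = conn.execute("SELECT File_Name FROM LSR_Prepare").fetchall()
print(remaining)
```

The key point is the `?` placeholder: the file name is passed as a bound parameter rather than concatenated into the SQL string, which is exactly how the Execute SQL Task's parameter mapping works.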
January 28, 2010 at 4:18 am
Many thanks Dave,
I accessed the link you supplied and took on board your advice on keeping the File System Task within the same 'For Each Loop' container.
The process works fine as per my original requirements.
The current process will loop through all csv files found within the network 'Upload' parent folder and
if successful (scenario 1)
if NOT successful (scenario 2)
However, I have an additional feature which I need to incorporate:
If the parent folder contains 10 csv files and the 7th file is corrupt, the package stops at the 'Data Flow Task' (loading to the SQL database) and places the 7th file in the 'Failed' child folder (scenario 2), but it doesn't process the 8th, 9th and 10th csv files.
How do I get the package to continue executing and return to the start of the For Each Loop container?
Thanks in advance,
January 28, 2010 at 4:25 am
Are you able to upload a screen shot of the control flow?
January 28, 2010 at 4:49 am
January 28, 2010 at 5:27 am
Please see attachment. Hope it can be viewed ok.
Basically it shows the loop through csv files (Foreach Loop container):
The first executable object is Get File Name (Script Task) - on success the next object is...
Load Data to the Database (Data Flow Task) - on success the next object is...
Move File to Extract Archive Folder (File System Task)
On failure the next objects are:
1. Move File to Corrupt Files Folder (File System Task) - on success the next object is
Delete Rows from SQL (Execute SQL Task)
2. Send Email if csv File Fails (Script Task)
Thanks,
January 28, 2010 at 5:39 am
Did you try willem's comment? The workflow looks as I would expect it to - I wanted to ensure the error parts were in the container and not outside of it.
D
January 28, 2010 at 7:11 am
Hi,
I did try (I was putting my last response together when his reply came through) and it is just what I need. But it's difficult to judge what to set the property to, as the number of files changes on an hourly basis.
That's for me to decide!!
Thanks very much to both of you.
January 28, 2010 at 8:03 am