For Each loop used to delete records

  • Hi,

    I currently have a working SSIS package that uses a For Each Loop container to load data from a csv file to a SQL table (dbo.LSR_Prepare), uses an SSIS package variable (fileName) to populate a SQL field (File_Name) with the csv file name, and then, if successful, moves the file to an 'Extract Archive' sub folder.

    If unsuccessful (mainly due to corrupted data), the file still loads rows into the SQL table (until a corrupt record is found), populating the SQL field with the csv file name, and is then moved to a 'Corrupt Files' sub folder.

    By linking 'on failure' from the first container, I have created a second For Each Loop container that loops through the 'Corrupt Files' folder and, using a Script Task, populates the aforementioned package variable with the csv file name.

    I now need to delete any records from the dbo.LSR_Prepare SQL table where File_Name = fileName (i.e. any csv file found).

    Any ideas?

    Thanks in advance,

  • I would use an Execute SQL Task and pass in the file name as a parameter...

    I would also probably use just the one container, with a failure precedence constraint after the Data Flow Task, instead of looping through the second folder...

    In the Execute SQL Task you would need to map the parameter and use a statement such as:

    delete from tblName where FileName = ?

    This article gives a good explanation of how the Execute SQL Task works, including the use of parameters:

    http://www.sqlis.com/post/The-Execute-SQL-Task.aspx
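
    To make the mapping concrete, a minimal sketch, assuming an OLE DB connection manager (so the placeholder is ? and it is mapped to the User::fileName variable as parameter name 0 on the Parameter Mapping page), using the table and column names mentioned in this thread:

    -- Statement entered in the Execute SQL Task:
    DELETE FROM dbo.LSR_Prepare
    WHERE File_Name = ?;

    -- What effectively runs for a given iteration, e.g. if fileName
    -- happened to hold 'LSR_20090713.csv' (made-up example value):
    DELETE FROM dbo.LSR_Prepare
    WHERE File_Name = 'LSR_20090713.csv';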

  • Many thanks Dave,

    I accessed the link you supplied and took on board your advice on keeping the File System Task within the same 'For Each Loop' container.

    The process works fine as per my original requirements.

    The current process will loop through all csv files found within the network 'Upload' parent folder and:

    if successful (scenario 1)

    • loads data into the SQL table
    • moves the csv file to the 'Archive' child folder

    if NOT successful (scenario 2)

    • deletes rows from the SQL table (specifically for the corrupt csv file) - see the check query sketched after this list
    • moves the file to the 'Failed' child folder
    • sends an email
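
    As a quick sanity check for scenario 2 (just a sketch I run manually, not part of the package), I can confirm that the corrupt file's rows really are gone - the file name below is a made-up example standing in for the User::fileName value:

    -- Assumes File_Name is a varchar column, as used elsewhere in this thread
    DECLARE @CorruptFile VARCHAR(255);
    SET @CorruptFile = 'LSR_20090713.csv';   -- stand-in for User::fileName

    -- Should return 0 once the delete step (Execute SQL Task) has run for that file
    SELECT COUNT(*) AS RemainingRows
    FROM dbo.LSR_Prepare
    WHERE File_Name = @CorruptFile;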

    However, I have an additional feature which I need to incorporate:

    If the parent folder has 10 csv files and the 7th file is corrupt, the package stops at the 'Load Data to the Database' Data Flow Task, places the 7th file in the 'Failed' child folder etc. (scenario 2), but doesn't process the 8th, 9th and 10th csv files.

    How do I get the package to continue executing and return to the start of the For Each Loop container?

    Thanks in advance,

  • Are you able to upload a screenshot of the control flow?

  • Raise the value of the MaximumErrorCount property of your For Each Loop container. By default it is set to 1, which makes the loop stop iterating when an error is encountered.

    Regards,

    Willem
    http://wschampheleer.wordpress.com

  • Please see attachment. Hope it can be viewed ok.

    Basically it shows the 'Loop through csv files' (For Each Loop container).

    The first executable object is Get File Name (Script Task) - on success the next object is..

    Load Data to the Database (Data Flow Task) - on success the next object is..

    Move File to Extract Archive Folder (File System Task)

    on failure the next objects are..

    1. Move File to Corrupt Files Folder (File System Task) - on success the next object is

    Delete Rows from SQL (Execute SQL Task)

    2. Send Email if csv File fails (Script Task)

    Thanks,

    Did you try Willem's comment? The workflow looks as I would expect it to - I wanted to ensure the error parts were in the container and not outside of it.

    D

  • Hi,

    I did try (I was putting my last response together whilst his reply came through) and it is just what I need. But it's difficult to judge what to set the property to, as the number of files changes on an hourly basis.

    That's for me to decide!!

    Thanks very much to both of you.

  • You're welcome.

    Unless you have additional requirements, just set it to a big number, e.g. 1,000,000,000.

    Regards,

    Willem
    http://wschampheleer.wordpress.com
