May 2, 2019 at 8:13 pm
Hello,
I created a package that uses a Foreach Loop to import CSV files from a defined directory into a SQL Server table.
It is possible that this directory contains some corrupt files, that is, files that do not have the number
of columns expected by the Flat File source component.
I would like all of the good files to be loaded and the corrupt files moved to another directory.
I used a File System Task to move the corrupt files to another directory, but despite that,
the package still generates an error.
What would be the solution so that processing completes despite the presence of corrupt files?
Thank you.
May 2, 2019 at 8:18 pm
Take a look at a post I wrote on that topic, using the same corrupt file pattern as an example: https://www.timmitchell.net/post/2013/08/05/continue-package-execution-after-error-in-ssis/
Tim Mitchell, Microsoft Data Platform MVP
Data Warehouse and ETL Consultant
TimMitchell.net | @Tim_Mitchell | Tyleris.com
ETL Best Practices
May 2, 2019 at 8:21 pm
Add a task to save the number of columns in the 'current' file before its data flow is executed.
Add a suitable precedence constraint to skip the data flow task if the saved column count differs from the expected value.
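The same pre-validation can be sketched outside SSIS, e.g. as a pre-processing step that quarantines bad files before the loop ever sees them. This is a minimal Python sketch, not the SSIS implementation itself; the expected column count of 5 and the directory names are hypothetical placeholders:

```python
import csv
import shutil
from pathlib import Path

EXPECTED_COLUMNS = 5  # hypothetical: the column count your Flat File source expects


def column_count(csv_path):
    """Return the number of columns in the first row of a CSV file."""
    with open(csv_path, newline="") as f:
        first_row = next(csv.reader(f), [])
    return len(first_row)


def quarantine_bad_files(source_dir, quarantine_dir, expected=EXPECTED_COLUMNS):
    """Move files whose column count differs from `expected` into
    `quarantine_dir`; return the list of files that are safe to load."""
    quarantine_dir = Path(quarantine_dir)
    quarantine_dir.mkdir(parents=True, exist_ok=True)
    good_files = []
    for path in sorted(Path(source_dir).glob("*.csv")):
        if column_count(path) == expected:
            good_files.append(path)
        else:
            # Corrupt file: move it aside so the load step never touches it.
            shutil.move(str(path), str(quarantine_dir / path.name))
    return good_files
```

The load step then iterates only over the returned list, which plays the same role as the precedence constraint above: the data flow simply never runs for a file with the wrong shape.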
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin