December 16, 2008 at 8:55 am
I have an SSIS package that within a loop executes a bulk insert for each of 50+ files in a folder. 99% of the time it executes perfectly. However, on a random basis the process has begun to double the number of rows inserted. This only happens for one file/table of the set. And it is never the same one twice. There is no error thrown and this was discovered by accident.
Has anyone had a similar experience? Or any suggestions about how to prevent this?
Thanks
December 17, 2008 at 6:13 am
Track the file name and execution times in a table somewhere.
I would suspect that your process is picking up a file more than once. A Foreach loop over a folder can easily get the same file twice if someone updates or even opens a file during a long-running loop. I like to move the files at the beginning of the process to a local folder that only the SSIS package has access to - that ensures users cannot touch anything while the package is running.
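The move-then-process pattern described above can be sketched outside of SSIS as well. The following is a minimal illustration in Python (the function name `stage_and_process` and the folder arguments are hypothetical, not part of any SSIS API): the file list is snapshotted and moved into a private staging folder in one pass, so a file a user touches mid-run cannot be enumerated a second time by the processing loop.

```python
import shutil
from pathlib import Path

def stage_and_process(source_dir, staging_dir, process):
    """Move every file out of the shared folder into a private staging
    folder, then loop over the staged copies only.

    Because the shared folder is enumerated exactly once, and each file
    is moved out of it before processing starts, a file that a user
    updates or re-opens mid-run cannot be picked up twice.
    """
    source = Path(source_dir)
    staging = Path(staging_dir)
    staging.mkdir(parents=True, exist_ok=True)

    # Snapshot the file list once and move each file out of reach.
    staged = []
    for f in sorted(source.iterdir()):
        if f.is_file():
            target = staging / f.name
            shutil.move(str(f), str(target))
            staged.append(target)

    # Process only the staged copies; the shared folder is now empty.
    for f in staged:
        process(f)
    return staged
```

In an actual SSIS package the same idea maps to a File System Task (move) running before the Foreach File enumerator, with the enumerator pointed at the staging folder rather than the shared drop folder.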