June 7, 2012 at 8:48 am
I have a couple of packages with a ForEach Loop that appears to be cycling through twice on one file when it should only cycle through once. It doesn't happen every time, and when it does, I never catch it in the act, so I can't figure out what's triggering it.
We've narrowed down a potential suspect. We think, maybe, that the File System Tasks we use (which connect to a NAS) aren't moving / renaming files fast enough, and that the loop starts a second run based on the filename just before the final step of the loop has finished processing on the NAS side of the equation.
The problem is, I have no proof, and it sounds like a ridiculous theory. The loop shouldn't start over until it gets confirmation from the drive that the FST has succeeded or failed, right?
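One way to try to catch it in the act: a quick polling sketch (Python here, just because it's easy to run ad hoc) that logs whenever the set of files matching the loop's mask changes. The UNC path and mask below are placeholders, not our real ones.

import glob
import time
from datetime import datetime

# Placeholder NAS folder and file mask -- substitute the real ones.
WATCH_PATTERN = r"\\nas\Inbound\SomeFeed*.xml"

seen = set()
while True:
    matches = set(glob.glob(WATCH_PATTERN))
    if matches != seen:
        # Log every change, so a file that reappears (or still matches
        # the mask after a rename) shows up with a timestamp.
        print(datetime.now().isoformat(), sorted(matches))
        seen = matches
    time.sleep(0.5)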
Any thoughts?
June 7, 2012 at 2:58 pm
Is the entire dataflow configured inside a Sequence container?
June 8, 2012 at 12:09 am
I think SSIS only checks the file list once; you could use Process Monitor to verify, though.
June 8, 2012 at 12:18 am
Enable logging on the PreExecute and PostExecute events, so you can see exactly what is going on.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
July 6, 2012 at 5:40 am
andersg98 (6/7/2012)
Is the entire dataflow configured inside a Sequence container?
No. It's configured in the ForEach container.
July 6, 2012 at 5:40 am
Koen Verbeeck (6/8/2012)
Enable logging on the PreExecute and PostExecute events, so you can see exactly what is going on.
I think I shall do just that. Thanks, Koen.
July 6, 2012 at 5:48 am
Brandie Tarvin (7/6/2012)
Koen Verbeeck (6/8/2012)
Enable logging on the PreExecute and PostExecute events, so you can see exactly what is going on.
I think I shall do just that. Thanks, Koen.
Hmm. Except it appears to be doing that already by default. Here's what I'm seeing for the PreExecute logging. Is this what should appear with it enabled, or is there more detail that would print if it were?
Last Week's Log File
Info: 2012-06-30 08:55:36.31
Code: 0x40043007
Source: Load XML File SSIS.Pipeline
Description: Pre-Execute phase is beginning.
End Info
Progress: 2012-06-30 08:55:36.31
Source: Load XML File
Pre-Execute: 0% complete
End Progress
Progress: 2012-06-30 08:55:36.33
Source: Load XML File
Pre-Execute: 33% complete
End Progress
Progress: 2012-06-30 08:55:36.50
Source: Load XML File
Pre-Execute: 66% complete
End Progress
Progress: 2012-06-30 08:55:36.50
Source: Load XML File
Pre-Execute: 100% complete
End Progress
Info: 2012-06-30 08:55:36.50
Code: 0x4004300C
Source: Load XML File SSIS.Pipeline
Description: Execute phase is beginning.
End Info
--Post Execute Log
Info: 2012-06-30 08:55:40.27
Code: 0x40043008
Source: Load XML File SSIS.Pipeline
Description: Post Execute phase is beginning.
End Info
Progress: 2012-06-30 08:55:40.27
Source: Load XML File
Post Execute: 0% complete
End Progress
Progress: 2012-06-30 08:55:40.27
Source: Load XML File
Post Execute: 33% complete
End Progress
Progress: 2012-06-30 08:55:40.27
Source: Load XML File
Post Execute: 66% complete
End Progress
Progress: 2012-06-30 08:55:40.27
Source: Load XML File
Post Execute: 100% complete
End Progress
Info: 2012-06-30 08:55:40.27
Code: 0x4004300B
Source: Load XML File SSIS.Pipeline
Description: "component "StagingFeed" (140)" wrote 91181 rows.
End Info
I think I need to append to the log file this week instead of overwrite with each execution.
July 6, 2012 at 5:58 am
So you can see more clearly what I'm working with, I've attached a snapshot of my SSIS package with a few sensitive things blurred out.
The job runs and sends one to two emails. The email in the Send & Archive container is always sent. The Error Email only gets sent if records have data issues.
Once those emails get sent out, we receive a SQL Agent Job Failure notification as if the entire package tried to run again. Last week, it ran for 19 seconds the "second time around." It errored, I believe, because there was no XML file to find. But it shouldn't have restarted any of the ForEach containers unless it thought there was a file there.
The first container "Archive Existing Files" was created to clear our Inbound folder in case of a file not getting archived properly the last time the job ran. Usually the package skips right over that step with a "File Not Found" information message.
July 9, 2012 at 12:22 am
Hmmm, I thought Pre/PostExecute would give more info about the various iterations of the loop. Apparently not.
Looking at the package design, everything seems OK, so I don't really have a clue what's going on.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
July 9, 2012 at 5:06 am
Oh, for the love of ...!#(*%$#W$
I think I just found out what the issue is. EDIT: Forget the screenshots.
I have a file that comes over from a vendor, named MyFile 20120709.xml (the date is embedded after the filename). The first step inside the loop renames the file to MyFile.xml.
The Files mask in the ForEach Loop's collection was MyFile*.xml. So even after the file was renamed, it still matched the mask, hence the second run of the container.
It should have been MyFile *.xml (note the space). And my testing seems to indicate I'm right on this.
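If you want to see why without rerunning the package, here's a quick illustration using Python's fnmatch (whose wildcard rules are close to, though not identical to, the enumerator's file mask matching):

from fnmatch import fnmatch

incoming = "MyFile 20120709.xml"  # file as it arrives from the vendor
renamed = "MyFile.xml"            # name after the first step inside the loop

for mask in ("MyFile*.xml", "MyFile *.xml"):
    print(mask, "->",
          "incoming:", fnmatch(incoming, mask),
          "renamed:", fnmatch(renamed, mask))

# MyFile*.xml  -> matches both, so the renamed file re-enters the loop
# MyFile *.xml -> matches only the incoming file (the space excludes MyFile.xml)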
The bugger about this whole issue is that it wasn't happening all the time.
July 9, 2012 at 5:24 am
These are the types of bugs that can drive you really mad 🙂
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
July 9, 2012 at 5:27 am
Koen Verbeeck (7/9/2012)
These are the types of bugs that can drive you really mad 🙂
Tell me about it!
Sure enough, the other packages that were hitting the exact same random error had the same mask problem.
July 9, 2012 at 5:39 am
It sure is a good start for the workweek, fixing that impossible bug 🙂
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP