January 16, 2008 at 2:35 pm
I have a simple SSIS project.
Step 1: Execute SQL Task
Step 2: Execute SQL Task
Step 3: Data Flow Task (contains three data imports)
Step 4: Execute SQL Task
However, when I test-execute my package, it enters the Data Flow Task, completes all the imports, and then just sits there. The output log says:
Information: 0x40043008 at Import, DTS.Pipeline: Post Execute phase is beginning.
In the progress tab I see
Progress: Post Execute - 10 percent complete
I've only imported about 120,000 records. Why would the Post Execute phase get stuck?
January 16, 2008 at 3:52 pm
A bit more information, if it helps.
I used DataReader Sources in my Data Flow portion. I had to, because the data comes from a proprietary database that ships with only an ODBC driver and has no OLE DB support.
As the SSIS package is in a test phase, the steps are as follows:
1. SQL 1: A loop through systables that drops all the user tables from the database
2. SQL 2: Creation of the tables to put data in
3. Data Flow: Three tables (so far) imported from ODBC to SQL Server Destinations
4. SQL 3: Creation of more tables and movement of data from the import.
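Step 1's drop loop amounts to building one dynamic DROP TABLE statement per user table found in systables. A minimal sketch in Python of the statements that Execute SQL Task would generate (the table names here are hypothetical, and the real step would be T-SQL inside the task):

```python
def build_drop_statements(user_tables):
    """Build one DROP TABLE statement per user table, mirroring
    the systables loop in SQL step 1 (a sketch, not the actual task)."""
    return ["DROP TABLE [{}]".format(name) for name in user_tables]

# Hypothetical user tables, as if read from systables (xtype = 'U')
tables = ["ImportCustomers", "ImportOrders", "ImportItems"]
for stmt in build_drop_statements(tables):
    print(stmt)
```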
Each of the Data Flows has a success path and a failure path. The failure paths convert the NTEXT fields from the proprietary ODBC database to TEXT fields (the data is actually text, but SSIS is treating it as NTEXT) and then output the rows to a CSV file for further analysis.
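The success/failure split above can be sketched as: rows whose text narrows cleanly go to the destination, and rows that don't survive the NTEXT-to-TEXT narrowing get redirected to a CSV. A simplified Python illustration (latin-1 stands in for the destination code page; the sample rows are made up):

```python
import csv
import io

def convert_rows(rows):
    """Split rows into successes (text narrows cleanly) and failures,
    mimicking the Data Flow's success and failure paths (a sketch)."""
    ok, failed = [], []
    for row in rows:
        try:
            # NTEXT -> TEXT: the narrow type can't hold characters
            # outside the code page; latin-1 stands in for that here.
            [col.encode("latin-1") for col in row]
            ok.append(row)
        except UnicodeEncodeError:
            failed.append(row)
    return ok, failed

rows = [["plain text"], ["caf\u00e9"], ["snowman \u2603"]]
ok, failed = convert_rows(rows)

# Failure path: write the rejected rows to CSV for further analysis
buf = io.StringIO()
csv.writer(buf).writerows(failed)
```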
What I don't understand is that all the Data Flow elements are green (success and done), but the Data Flow task on the Control Flow tab is yellow (still running), and the debug window says it has entered the Post Execute phase.
I haven't defined a Post Execute phase; what is it trying to do?
January 17, 2008 at 9:07 am
Well I hope somebody has an idea...
Here's my latest attempt. I figured that since I had a single data flow pair (DataReader -> SQL Server) that was actually failing some records, that might be the cause. So I created two Data Flows.
Data Flow 1 has all the successful DataReader -> SQL Server pairs. Data Flow 2 has the one that generates 300 errors out of 103,000 records and writes them to a CSV.
Data Flow 1 executes first and then goes to Data Flow 2. When I test-run now, the SSIS package processes the two SQL statements, then the first Data Flow, with no problems. It then hangs on Data Flow 2, where the error handling is.
I have set Data Flow 2's MaximumErrorCount to 2,000,000,000 allowable failures and set FailPackageOnFailure and FailParentOnFailure to false.
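As I understand those settings, they interact roughly like this: a task fails only once its error count reaches MaximumErrorCount, and FailPackageOnFailure / FailParentOnFailure then decide whether that failure propagates upward. A simplified sketch of the gating (not SSIS internals):

```python
def task_fails(error_count, maximum_error_count):
    """A task fails once errors reach MaximumErrorCount (default 1,
    so the first error normally fails the task). Propagation to the
    package/parent is governed separately by FailPackageOnFailure
    and FailParentOnFailure. A simplified model, not SSIS internals."""
    return error_count >= maximum_error_count

# 300 bad rows out of 103,000 against a 2,000,000,000 ceiling
print(task_fails(300, 2_000_000_000))
```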
Is there another failure handling setting that I'm missing in the configuration that should impact this?
October 1, 2008 at 10:49 am
Hi Mark,
Any luck on this? I'm experiencing the same issue where the Post Execute and Cleanup stages are taking a very long time to execute, even though the Data Flow containers are all showing green...
October 1, 2008 at 2:22 pm
Hi Mark, in the Data Flow Task, instead of connecting the same OLE DB Source to both the flat file and the OLE DB Destination, take two OLE DB Source tasks: connect one to the Flat File Destination and the other to the OLE DB Destination, and try that.
This might work.
October 1, 2008 at 2:50 pm
After a lot of work on this project... I eventually gave up...
Since I was working with a proprietary data source, I had one of two problems:
1. corrupted data in the source (most likely)
2. a bad ODBC driver (less likely, since it worked for one data set but not the other)
In either case, I was unable to open the data source with Access, Excel, or Crystal Reports, and SSIS would hang on it.
At that point I was tasked with other projects... I've since given up on this.