March 1, 2009 at 2:45 pm
Hi,
I am moving a comma-delimited file into SQL Server 2008. The file has errors (quotes within quotes between commas), but that's OK. What I want is for the package not to fail but to handle the error and move on. I have a Flat File source and an OLE DB destination. Here is the output. Any ideas would help. Thanks in advance.
SSIS package "Properties.dtsx" starting.
Information: 0x4004300A at Properties Upload, SSIS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Properties Upload, SSIS.Pipeline: Validation phase is beginning.
Warning: 0x80049304 at Properties Upload, SSIS.Pipeline: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.
Information: 0x40043006 at Properties Upload, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Properties Upload, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Properties Upload, Properties Import [1]: The processing of file "C:\AllTest\properties.csv" has started.
Information: 0x4004300C at Properties Upload, SSIS.Pipeline: Execute phase is beginning.
Error: 0xC0202055 at Properties Upload, Properties Import [1]: The column delimiter for column "Property_Zip_County_Id" was not found.
Error: 0xC0202092 at Properties Upload, Properties Import [1]: An error occurred while processing file "C:\AllTest\properties.csv" on data row 2147.
Error: 0xC0047038 at Properties Upload, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Properties Import" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Information: 0x40043008 at Properties Upload, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Properties Upload, Properties Import [1]: The processing of file "C:\AllTest\properties.csv" has ended.
Information: 0x402090DF at Properties Upload, Properties Export [9]: The final commit for the data insertion in "component "Properties Export" (9)" has started.
Information: 0x402090E0 at Properties Upload, Properties Export [9]: The final commit for the data insertion in "component "Properties Export" (9)" has ended.
Information: 0x4004300B at Properties Upload, SSIS.Pipeline: "component "Properties Export" (9)" wrote 1900 rows.
Information: 0x40043009 at Properties Upload, SSIS.Pipeline: Cleanup phase is beginning.
Task failed: Properties Upload
SSIS package "Properties.dtsx" finished: Success.
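For illustration, a bad row of the kind I mean (the values here are made up) looks something like this, with unescaped quotes inside a quoted field:

101,"Smith, John","He said "stop" twice",75001

The inner quotes end the quoted field early, so the parser can no longer find the column delimiters it expects, which matches the "column delimiter ... was not found" error above.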
March 1, 2009 at 3:54 pm
You could add another destination for the errors. For example, add a flat file destination, connect the red line from the source to that and reconfigure the Error Output on the source to redirect rows instead of failing.
                   |--> [OLE DB Destination]
[Flat File Source]-|
                   |--> [Flat File Destination]
March 2, 2009 at 10:49 am
Yes, I tried configuring the error output, but the Flat File source keeps failing. I'll keep trying to find an alternative method.
March 2, 2009 at 8:24 pm
How did you configure the error output? Did you change it to redirect or ignore for every column?
You could post your .DTSX file and a sample input file that includes good and bad entries. Someone might be able to offer a suggestion from that.
March 3, 2009 at 4:45 am
According to your description, you know the flat file has an error, but you don't want the import/SSIS to fail.
Therefore you can try:
1. Open the Flat File Source Editor.
2. Select Error Output in the left panel.
3. Select the column(s) in question and set both Error and Truncation to Ignore Failure.
March 4, 2009 at 10:07 pm
Is Property_Zip_County_Id the last column in the row? In that case your error might be caused by an incorrect row delimiter, and you can fix the error rather than ignore it. Anyway, here's hoping.
March 7, 2009 at 2:47 am
Robert
Why not use BULK INSERT into a staging table with one field, and then SELECT SUBSTRING(field, x, y) etc. to get out the individual fields in the text file?
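A minimal sketch of that approach (the staging table name, column width, and parsing are my own illustrative assumptions, not from the original post):

-- Stage every line of the file into a single wide column.
CREATE TABLE dbo.PropertiesStaging (RawLine varchar(4000));

-- FIELDTERMINATOR is set to a character that should never occur in
-- the data, so each whole line lands in the one column.
BULK INSERT dbo.PropertiesStaging
FROM 'C:\AllTest\properties.csv'
WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\0');

-- Parse the fields out afterwards: SUBSTRING(RawLine, x, y) for
-- fixed positions, or CHARINDEX to locate commas in a delimited file.
SELECT SUBSTRING(RawLine, 1, CHARINDEX(',', RawLine + ',') - 1) AS FirstField
FROM dbo.PropertiesStaging;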
I started to use SSIS but ran into problems like the ones you describe, so I went back to basics.
Gosta
March 8, 2009 at 10:38 pm
You should configure your Flat File source to the setting "Redirect Rows" at the row level. This can be found under the Error Output tab. The default setting is to fail the component on error, so unless you've changed this, your error output will be ignored and the component will fail when it encounters such an error.
hth,
Tim
Tim Mitchell
TimMitchell.net | @tmitch.net | Tyleris.com
ETL Best Practices
February 16, 2011 at 1:00 pm
This problem seems to be 'eons' old, but I will make these comments anyway!
I too have suffered because of inconsistencies in the SSIS code. I now have a standard setup for creating packages. Just like everyone else, I am working in a mixed environment of hardware and software packages, versions, and releases. I started out receiving the same message that is posted in this forum.
"Warning: 0x80049304 at Vendor Direct Deposit Load, SSIS.Pipeline: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console."
This apparently relates to SSIS being unable to capture performance metrics when mixing 64-bit and 32-bit processes. Most often, I am getting input data from a CA Datacom 'relational' database table and moving it to a SQL Server 2005 (soon to be 2008) environment. The mainframe host is in the OS/390 family. I am loading records into a SQL Server 2005 database sitting on a 32-bit machine, but I'm writing the SSIS packages on my Windows 7 64-bit desktop. Previously I was using a Windows XP virtual machine (VM) for writing packages. As soon as I started using the Win 7 machine, I started getting errors concerning the ODBC drivers, and then came the performance counters warning in SSIS.
All of the mainframe records, at a minimum, require that their data types be converted from Unicode to non-Unicode format. Now, when I create my packages, I always do the following.
1. Set up the basic package:
a. Read the table
b. Convert the columns (a sketch of this conversion follows the list)
c. Write output to an OLE DB table
2. Next, I go back and add flat files to 'potentially' capture the error output from each of the three phases.
3. Then I go back into each control (read, convert, and load) and set 'Redirect row' so that error output goes into the flat files.
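As promised, here is roughly what the conversion in step 1b amounts to, expressed in T-SQL rather than as a Data Conversion transformation (the table and column names are made up for illustration):

-- The Data Conversion step casts each Unicode column (DT_WSTR)
-- down to its non-Unicode equivalent (DT_STR); in T-SQL terms:
SELECT CAST(CustomerName AS varchar(50)) AS CustomerName_NonUnicode
FROM dbo.MainframeStaging;  -- hypothetical staging table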
It's not pretty, and it adds overhead to the resource load, but it works. I'm three for three right now. You will continue to receive the WARNING about global shared memory, but your package will complete and the appropriate records will be written. I've become so confident that I have set up a trio of flat files that I reference in every package, since nothing ever gets written to them. I hope you are as fortunate.
One of the self-appointed SQL Server gurus posted information about performance counter inconsistencies in the following blog. I'm probably not going to do what is suggested unless I see a need to change what is currently working for us.
David