June 11, 2009 at 8:12 pm
Background: We are trying to set up an ETL/Dataload job from an Oracle 10g database to a SQL Server 2008 using SSIS.
Both source and target tables have varchar fields of the same length. The Oracle database uses the WE8DEC character set. The data provider is Oracle OLE DB.
Previous dataloads from a 9i database work, but now (with what should be the same data) the dataloads from Oracle 10g fail with the following error:
* Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "KTEXT" (44) to column "KTEXT" (509). The conversion returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
Working through a process of elimination, we have focused on one table that fails.
To rule out funky data, I created a linked server to the same source database and selected the data through the linked server into a MSSQL2K8 database on my XP PC. I repeated the same step on our server (Win Server 2K8 & MSSQL2K8).
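For anyone wanting to repeat the linked-server check, the test was roughly this (ORA_SRC and the schema/table names are placeholders, not our real ones):

-- Pull the suspect table across the linked server into a scratch table.
-- ORA_SRC is a placeholder linked server name; SRC_SCHEMA.SRC_TABLE is the Oracle table.
SELECT *
INTO dbo.KTEXT_LinkedServerTest
FROM OPENQUERY(ORA_SRC, 'SELECT * FROM SRC_SCHEMA.SRC_TABLE');

If that runs without errors, the data itself gets across fine outside of SSIS.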
To summarise:
I can import data through a linked server from Oracle to MSSQL both on my PC and on our server.
Trying to use the import wizard/SSIS to import exactly the same data from the same source table and insert it into the same target table fails in both cases.
In my mind, because it does work via a linked server (without errors), I can rule out character sets, table structure, unusual characters within the data, connectivity, uid/pwd's, schemas, Windows OS version, etc.
All of the forums and posts I have found have referred to column widths when importing from flat/text files, not from db to db.
Can anybody please tell me why this would happen?
June 16, 2009 at 7:03 pm
We're still working on this...
We've been back through the basics. Still no success.
July 8, 2009 at 5:45 pm
The solution (or at least our work-around):
It seems that when we migrated the original schema from Oracle 9i to MSSQL using SSMA it created the tables with standard varchar field definitions.
Then we started working with another database which has the same schema definition on 10g. SSIS detects that the Oracle 10g columns are Unicode-compatible, even though we were using a non-Unicode character set.
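If you want to see what your source database actually reports, you can query the NLS settings through the linked server; something like this (again, ORA_SRC is a placeholder name):

-- Check the Oracle database character sets through the linked server.
SELECT *
FROM OPENQUERY(ORA_SRC,
    'SELECT parameter, value FROM nls_database_parameters
     WHERE parameter IN (''NLS_CHARACTERSET'', ''NLS_NCHAR_CHARACTERSET'')');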
Selecting data using T-SQL via a linked server works fine. It seems T-SQL does some sort of implicit conversion or is simply oblivious to this.
The import wizard/DTS/SSIS, however, is not so happy. To get around the problem, we dropped and recreated the tables using nvarchar fields, and SSIS now happily runs.
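We actually scripted out and recreated the tables, but column by column the change on the SQL Server side amounts to something like this (dbo.MyTable and the length here are just an example, not our real schema):

-- Change the affected column(s) to nvarchar so SSIS can map the
-- Unicode output from the Oracle OLE DB source without a conversion.
ALTER TABLE dbo.MyTable ALTER COLUMN KTEXT nvarchar(255) NULL;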
I can't say it's a clever or elegant solution, and we've wasted a damn lot of time compensating for the quirks of SSIS. Once again, not impressed, Bill. Not only is SSIS finicky, its behaviour is inconsistent with T-SQL and linked servers. :angry:
So I don't know what this means for everyone else, but potentially anyone using SSIS as an ETL tool will need to convert their target tables to Unicode, whether they use Unicode characters or not.
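If you want a feel for how much of your target schema would be affected, a quick check of the non-Unicode string columns is something like this (purely a sketch; filter by schema or table to suit):

-- List the non-Unicode string columns in the target database
-- that would need converting to nvarchar/nchar for an SSIS load.
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('varchar', 'char', 'text')
ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;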