December 20, 2006 at 3:15 am
I have tried to import data from a big flat file (257 MB, about 1,000,000 rows) into a table using SSIS. The process doesn't work correctly: it always stops at the same record (around row 179,000). I have reviewed the process and tried a shortened copy of the file (about 150,000 lines, including the line that previously seemed to break the process), and that works. Making the file longer breaks the process again.
The errors shown by the program:
Error: 0xC02020A1 at Data Flow Task, Flat File Source [2820]: Data conversion failed. The data conversion for column "debe" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Data Flow Task, Flat File Source [2820]: The "output column "debe" (4066)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "debe" (4066)" specifies failure on error. An error occurred on the specified object of the specified component.
Error: 0xC0202092 at Data Flow Task, Flat File Source [2820]: An error occurred while processing file "Z:\PROYECTOS IT\Soluciones de negocio\Proyectos\SANDO\Proyecto 5. Codificación\ETL\capun00108.unl" on data row 36804.
Error: 0xC0047038 at Data Flow Task, DTS.Pipeline: The PrimeOutput method on component "Flat File Source" (2820) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.
Does anybody know what could be happening?
Any help would be greatly appreciated.
Thanks
PS: Note that it's not a date format problem (I have reviewed some posts about that).
December 20, 2006 at 3:37 am
Hi,
It sounds like the problem is in your source data. It might be caused by commas or quotes in the content of your source data, which break the file layout that the data flow task is expecting.
Regards
Daniel
December 20, 2006 at 9:30 am
Thanks for your answer!
I don't think that having data with commas breaks the process; other data flows of mine also have them and they work. The file separates the data columns with "|".
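One quick way to test that hypothesis outside SSIS is to count the fields on every row. A minimal Python sketch (the path, encoding, and expected column count below are placeholders, not values from this thread):

```python
# Minimal sketch: flag rows whose "|"-separated field count differs from
# the expected one. PATH, the encoding, and EXPECTED_FIELDS are assumptions.
PATH = r"capun00108.unl"    # shortened path; see the error message above
EXPECTED_FIELDS = 12        # hypothetical column count; adjust to your layout

with open(PATH, encoding="latin-1") as f:   # encoding is an assumption
    for lineno, line in enumerate(f, start=1):
        count = len(line.rstrip("\r\n").split("|"))
        if count != EXPECTED_FIELDS:
            print(f"row {lineno}: {count} fields")
```

A row with an embedded "|" or a stray quote would show up with the wrong field count.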
December 20, 2006 at 9:46 am
I would still be looking at the data in the source file around the problem row. Try opening the text file, navigating to line 36804 where it is failing, and examining the record.
I use this great little app to help me read large text files that Notepad can't handle:
http://www.topshareware.com/Large-Text-File-Viewer-download-11000.htm
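If you'd rather script it than install a viewer, a minimal Python sketch that prints the rows around the failing one (row 36804 comes from the error output; the shortened path and the encoding are assumptions):

```python
# Minimal sketch: print the rows surrounding the one SSIS reports as failing.
# Note SSIS counts data rows, so the text line number may be offset by headers.
PATH = r"capun00108.unl"    # shortened path from the error message
TARGET = 36804              # failing row from the error output

with open(PATH, encoding="latin-1") as f:   # encoding is an assumption
    for lineno, line in enumerate(f, start=1):
        if TARGET - 2 <= lineno <= TARGET + 2:
            print(lineno, repr(line))       # repr() exposes stray characters
        elif lineno > TARGET + 2:
            break
```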
Regards
Daniel
December 21, 2006 at 3:50 am
I think that my predecessor is right in assuming that the error comes from the source data.
The error message about 'loss of data' is a strong hint.
As an alternative way of finding the row and values causing your problem, you can configure the error handling of your data access component to route the erroneous records to another destination.
You might want to write them to a CSV or TXT file and examine them there.
You will also find out whether the length of your file plays a role, because the process should then continue with the records after the one causing the problem.
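Outside SSIS you can approximate that redirection with a few lines of script. A rough Python sketch (the position of the "debe" column, the decimal convention, and the encoding are assumptions):

```python
# Minimal sketch: emulate error-row redirection for the "debe" column by
# writing every row whose value fails numeric conversion to a side file.
from decimal import Decimal, InvalidOperation

PATH = r"capun00108.unl"    # shortened path from the error message
DEBE_INDEX = 5              # hypothetical 0-based position of "debe"

with open(PATH, encoding="latin-1") as src, \
     open("debe_errors.txt", "w", encoding="latin-1") as errs:
    for line in src:
        fields = line.rstrip("\r\n").split("|")
        try:
            Decimal(fields[DEBE_INDEX].replace(",", "."))  # comma decimals assumed
        except (InvalidOperation, IndexError):
            errs.write(line)    # keep the raw row for inspection
```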
Regards,
Michael
July 12, 2011 at 12:39 pm
I ran into this problem as well.
The row giving me issues happened to be the last row of the table.
The .txt file contained a custom delimiter at the end of each row.
Removing the end-row delimiter from the last row fixed it.
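For anyone checking their own files, a minimal Python sketch to detect and strip a trailing row delimiter (the path and delimiter value are placeholders, not from this thread):

```python
# Minimal sketch: if the file ends with the custom row delimiter, drop it
# so the source doesn't see a phantom empty final row.
PATH = r"data.txt"          # placeholder path
ROW_DELIM = b"|~|"          # hypothetical custom end-of-row delimiter

with open(PATH, "rb") as f:
    data = f.read()
if data.endswith(ROW_DELIM):
    with open(PATH, "wb") as f:
        f.write(data[: -len(ROW_DELIM)])   # remove the final delimiter
    print("trailing row delimiter removed")
```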