Issue with SSIS data conversion

  • I have a tab-delimited ASCII file containing several fields that contain "date" data in this format:

    2020-04-18T00:00:00

    There are about 10 of them.  The flat file source definition has these fields set as DT_STR(50).

    Next, there is a Data Conversion step that converts each of these fields to DT_DBTIMESTAMP.

    These fields get imported into an OLE DB Destination (SQL 2012), where each field is set as type DATETIME.

    All of this works fine.
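
    As a sanity check, that ISO 8601-style value converts cleanly to DATETIME in plain T-SQL (style 126 is the ISO 8601 style):

        -- Quick sanity check: the flat file's date format is valid ISO 8601
        SELECT CONVERT(DATETIME, '2020-04-18T00:00:00', 126) AS converted_value;
        -- returns: 2020-04-18 00:00:00.000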

    We are now adding 6 more fields to the end of the flat file. 2 are dates, 4 are text. We defined the 2 new date fields (VOPENDATE_op and VOPENDATE2_op) in the flat file source the same way - DT_STR(50). We then do a data conversion on the 2 fields, converting them to DT_DBTIMESTAMP (converted field names are VOPENDATE and VOPENDATE2, respectively), and map them to DATETIME fields in the SQL table.

    However, since adding the new date fields, the job fails:

    Code: 0xC02020C5     Source: Res import Data Conversion 1 [2]     
    Description: Data conversion failed while converting column "VOPENDATE_op" (1552) to column "VOPENDATE" (530).
    The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
    End Error

    Error: 2020-04-16 20:17:36.83 Code: 0xC0209029 Source: Res import Data Conversion 1 [2]
    Description: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.
    The "Data Conversion 1.Outputs[Data Conversion Output].Columns[VOPENDATE]" failed because
    error code 0xC020907F occurred, and the error row disposition on
    "Data Conversion 1.Outputs[Data Conversion Output].Columns[VOPENDATE]" specifies failure on error.

    Error: 2020-04-16 20:17:36.83 Code: 0xC0047022 Source: Res import SSIS.Pipeline
    Description: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on
    component "Data Conversion 1" (2) failed with error code 0xC0209029 while processing input
    "Data Conversion Input" (3). The identified component returned an error from the ProcessInput method.
    The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

    I have tried many combinations, including setting the data type directly in the flat file source definition to DT_DBTIMESTAMP, to no avail. The column data looks clean, with no extraneous tabs or commas. I even tried mapping the NEW flat file fields into older date fields in the SQL table. No joy. I have compared the metadata for input and output every step of the way - these new date fields are configured identically to the existing date fields that work fine.
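
    One further check would be to bulk-load the raw strings into a staging table and let TRY_CONVERT (available in SQL 2012) flag any value that refuses to convert - a sketch, where the staging table name is made up:

        -- Sketch: list raw values that will not convert to DATETIME
        -- (dbo.ResImportStaging is a hypothetical staging table)
        SELECT VOPENDATE_op
        FROM dbo.ResImportStaging
        WHERE VOPENDATE_op IS NOT NULL
          AND TRY_CONVERT(DATETIME, VOPENDATE_op, 126) IS NULL;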

    Anyone have any ideas?

    Hoping it's something silly that my tired eyes have missed.

    Thanks!

  • Can you show us a couple of rows of actual data, indicating which are the new columns and which are the ones that import correctly?

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Here are 2 records. The first is a record that has data in the new fields and fails. The 2nd is one that imports correctly. If I change the data conversion to IGNORE errors and truncation issues, the 1st record imports into the table (obviously without the data in these 2 new fields). The new columns are the last 6 columns. As you can see, each record has other date fields, which are all OK. The new date fields that fail are VOPENDATE and VOPENDATE2.

    I pasted the records into a plain text box below, and also uploaded as a TXT file.

    Thanks

    FIELDS:

    AREAAREASHORTAREANAMEBASEMENTBATHSBATHSFULLBATHSPARTBDRMSBSMTDESCCOUNTYCOUNTYDESCCOUNTYIDDATECOEDATEEXPDATEHSDATELASTMODDATELISTDATEOFFDATEPENDDATESTATDATEWITHDTADDDTMODEAYNELEMSCHOOLEXTERIORFIREPLACEHEATCOOLINTERNETREMARKSITEMNOTINCITEMSINCJHSCHOOLLAG1EMAILLAG1IDLAG1NAMELAG1PHONELAG2IDLAG2NAMELAG2PHONELOFFEXTLOFFIDLOFFNAMELOFFPHONELEASEDATELIFESTYLELISTINGIDLOCALELOTDESCLOTDIMENLOTFRONTAGEMISCMLSNUMBERNUMROOMSOPENDATEOPENDATE2OPENINFOOPENINFO2OPENTIMEOPENTIME2PHOTONOPHOTOYNPRICECURRENTPRICEORGPRICESALEPROPSHORTPROPTYPESEWERSHSCHOOLSMACRESSTATUSSTATUSFLAGSTREETSTREETDIRSTREETDIRPOSTSTREETNOSTREETNUMSTREETNUMBERSTREETTYPSTYLESUBDIVSUBSTYLESUBAREATOWNLONGVIEWSEXPOSUREVIRTUALTOURVOWADDRWATERYNWATERFRONTYEARBUILTZIPPCHDATEDATEBOMDATEENTRYGARAGEPRICEPREVSTATEIDASSOCIATIONFLOODPLAINSUBSUBAREADIRECTIONSDOMOWNEROWNERPHONEREMARKSGENERALSHOWINSTRUCTIONSSTATUACOMBOTAXTOURDATETOURKEYATTOURTIMEPRICESOFFNAMEHDPHOTOLATITUDELONGITUDEINTERNETYNSAG1NAMESOFFIDSAG1IDSAG2IDSAG2NAMEVIRTUALTOUR2BRANDEDTOURIDXPOOLTYPECOOLINGVOPENDATEVOPENDATE2VIRTUALOPENTIMEVIRTUALOPENTIME2VIRTUALOPENURLVIRTUALOPENURL2

    RECORD#1:

    2MAHWHMahwahFinished,Full2.00204Finished, Laundry, UtilitiesBERGEN02BERGEN2021-02-12T00:00:002020-03-18T16:12:202020-02-14T00:00:002020-03-18T00:00:002020-02-14T13:24:132020-04-16T15:44:01RBetsy Ross E.S.1 Fireplace,Self-startGas,Hot AirWelcome home to this meticulously maintained ranch located in the Cragmere section of Mahwah! This ranch boasts 4 bedrooms with 2 newly updated baths, a spacious living room, dining room, BRAND NEW modern eat-in kitchen, and a beautiful finished basement. If you're looking for that farmhouse feel - look no further! Gorgeous handmade barn doors lead you into the master bath and plenty of rustic paneling throughout the home helps you relax in serenity. Summertime have your friends and family over for a BBQ and drinks with all the beautiful established perennial gardens in your backyard. Close to schools, shopping, transportation, major highways, the Ramapo Reservation, Campgaw Ski Mountain and more. This home has everything you're looking for...schedule your showing today!See SDSee SDRamapo Ridge M.S.matthew.clark@cbmoves.com1016586Matthew K. Clark(201) 819-058263626Coldwell Banker, Allendale/Saddle River Valley(201) 327-5050Close/Parks,Close/School,Close/Shopg,Close/Trans,Close/Wrshp,1st Flr Mbdrm,One Floor Living20006513600Corner LotHdwd As In20006513823Y537000.00537000.00RES1MunicipalMahwah H.S.ACTIVEAirmount148148148RoadRanchOther0233MahwahNoneYESNone1900-1939074302020-03-18T15:12:20Attached,Gar Opener,2 CarNJNoneNoneFranklin Tpk. To Airmount63Hartigan Karen(201) 819-0582Welcome home to this meticulously maintained ranch located in the Cragmere section of Mahwah! This ranch boasts 4 bedrooms with 2 newly updated baths, a spacious living room, dining room, BRAND NEW modern eat-in kitchen, and a beautiful finished basement. If you're looking for that farmhouse feel - look no further! Gorgeous handmade barn doors lead you into the master bath and plenty of rustic paneling throughout the home helps you relax in serenity. Summertime have your friends and family over for a BBQ and drinks with all the beautiful established perennial gardens in your backyard. Close to schools, shopping, transportation, major highways, the Ramapo Reservation, Campgaw Ski Mountain and more. This home has everything you're looking for...schedule your showing today!BOM. 24 HR Notice NEEDED. Call/Text LA Matt with any questions 201-819-0582A 8099537000.00No41.088637-74.138700YESYNoneCentral Air2020-04-18T00:00:002020-04-19T00:00:0012:00 P.M. - 4:00 P.M.12:00 P.M. - 4:00 P.M.https://www.youtube.com/watch?v=IUmti9DEQhw&t=12shttps://www.youtube.com/watch?v=IUmti9DEQhw&t=12s

    RECORD#2:

    2USDLRUpper Saddle RiverUnfinished8.00805BERGEN02BERGEN2020-05-10T00:00:002020-04-14T10:35:332020-02-12T00:00:002020-02-12T00:00:002020-02-12T14:57:282020-04-16T15:34:45RStone Faced,Mason/stucco2 FireplacesBaseboard,Gas,Hot AirHuge, luxurious property located in a serene neighborhood. This property features 5 bedrooms, 8 bathrooms, a dining/living room, a spacious kitchen, and a family room. It includes 2 fireplaces, an unfinished basement, a 3-car garage with a large private driveway, an in-ground pool, and a greenhouse. Best and highest deadline on 4/17/2020 at 4:30 PMemilio.oscanoa@gmail.com1007732Emilio Oscanoa(973) 859-7537255 890Nicholas Real Estate Agency(973) 340-1202None20006148600RegularNone200061481816Y1050000.001175000.00RES1SepticACTIVEPeach Tree131313PlaceColonialOther0263Upper Saddle RiverNoneYESNone1980's074584/14/20203+car,Attached1150000.00NJNoneNoneCider Hill to Peach Tree Pl65Rep by LA(862) 340-1202Email mezahugo@yahoo.com for showing instructions. Property's address on subject line.A 358481050000.00No41.057910-74.083756YESNIn-groundCentral Air
  • No matter how closely I look at those two columns, with the data from row 1, I cannot see any issue whatsoever. You can prove that to yourself by overwriting the dates in the two columns with dates that 'work' from earlier in the file and rerunning. I'll bet that the error remains, proving that this is not a data issue.

    There must be something else going on in the package.

    One thing you could try is adding a data viewer just before the data conversion component, just to satisfy yourself that the data going in is as expected.


  • Phil

    Thanks for your efforts so far.

    I see how to enable the data viewer on the pipe prior to conversion, but where do I actually view the data? I design the package in VS, save it to disk, and run it through SSMS as a job.

    --

    Scot

  • To see the data interactively (and to do lots of other types of debugging), you need to run the package in Visual Studio.

    You will be saving yourself a lot of time, effort and frustration if you change the way you develop packages such that your initial development work is done entirely in VS.


  • OK, I ran it within VS:

    Error: 0xC02020C5 at Res import, Data Conversion 1 [2]: Data conversion failed while converting column "VOPENDATE_op" (1545) to column "VOPENDATE" (527). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

    Error: 0xC0209029 at Res import, Data Conversion 1 [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion 1.Outputs[Data Conversion Output].Columns[VOPENDATE]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion 1.Outputs[Data Conversion Output].Columns[VOPENDATE]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

    Error: 0xC0047022 at Res import, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion 1" (2) failed with error code 0xC0209029 while processing input "Data Conversion Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

    In the data viewer, all looks good. The first field imports fine; VOPENDATE_op does not:

    DATE_LIST_op           VOPENDATE_op
    2020-04-03T00:00:00    NULL
    2020-02-10T00:00:00    NULL
    2020-04-16T00:00:00    NULL
    2020-02-18T00:00:00    NULL
    2020-03-18T00:00:00    NULL
    2020-04-16T00:00:00    NULL
    2020-03-20T00:00:00    NULL
    2020-02-29T00:00:00    NULL
    2020-03-02T00:00:00    NULL
    2020-03-20T00:00:00    NULL
    2020-02-25T00:00:00    NULL
    2019-03-18T00:00:00    NULL
    2020-01-13T00:00:00    NULL
    2020-04-15T00:00:00    NULL
    2020-04-09T00:00:00    NULL
    2020-04-07T00:00:00    NULL
    2020-04-17T00:00:00    2020-04-18T00:00:00
    2020-04-15T00:00:00    NULL
    2020-01-15T00:00:00    NULL
    2020-01-21T00:00:00    NULL
    2020-04-16T00:00:00    NULL
    2019-09-18T00:00:00    NULL
    2019-12-12T00:00:00    NULL
    2020-03-30T00:00:00    NULL
    2019-12-03T00:00:00    NULL
    2020-04-16T00:00:00    NULL
    2020-04-13T00:00:00    NULL

    Still at a loss...

  • It looks like a NULL handling issue then ...


  • Could it be a problem not with the valid dates, but with the NULLs?

    As a quick test for that, you could make 2 files, one that has a value in that date field and one that does not (use your 2 sample rows, for example), and try importing a single row. You could even turn off the destination portion of the SSIS package, which would allow you to run the package repeatedly without updating any real data.

    I am just wondering if MAYBE the problem is in a different place in that chain.

    Also, as a semi-random thought, does your destination table allow NULLs in that column?
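
    Something like this would confirm it (a sketch - dbo.ResImport is a placeholder for your real destination table):

        -- Check whether the destination columns accept NULLs
        -- (dbo.ResImport is a placeholder table name)
        SELECT c.name, c.is_nullable
        FROM sys.columns AS c
        WHERE c.object_id = OBJECT_ID(N'dbo.ResImport')
          AND c.name IN (N'VOPENDATE', N'VOPENDATE2');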

    The above is all just my opinion on what you should do.
    As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
    I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.

    I thought it was NULLs also, but many of the other date fields have NULL values and they are fine as well. The data I posted above used another date field that is always populated. Here is another test using an existing date field that also has NULLs, just like the new field:

    DATE_COE_op            VOPENDATE_op
    NULL                   NULL
    2020-04-16T00:00:00    NULL
    NULL                   NULL
    NULL                   NULL
    2020-04-17T00:00:00    NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    2020-04-17T00:00:00    NULL
    NULL                   NULL
    2020-04-16T00:00:00    NULL
    2020-04-14T00:00:00    NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   NULL
    NULL                   2020-04-18T00:00:00
    NULL                   NULL
    2020-04-09T00:00:00    NULL
    NULL                   NULL
    NULL                   NULL
  • Just to rule out a NULL issue, try importing a test file containing 5 or 10 rows, with no NULLs in those columns.


  • OK, I modified the flat file to include only 1 row where the new fields are not null. Same results 🙁

  • As a thought... if you disable the write-to-destination step of the SSIS package, do you still get errors?

    I am wondering if MAYBE the problem is on the destination table and not the SSIS package...

    Also, does your destination table have 132 columns in it like the input data file? If so, I wonder if you are hitting a limit on the SQL side (8,060 bytes per row) and SSIS is just giving a bad error message...
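
    A rough way to check the defined row width (a sketch - dbo.ResImport is a placeholder for your actual table name):

        -- Sum of the defined column widths; more than 8,060 bytes of
        -- fixed-width data per row will not fit on a SQL Server data page
        -- (max_length is -1 for MAX columns, which can be stored off-row)
        SELECT SUM(c.max_length) AS defined_row_bytes
        FROM sys.columns AS c
        WHERE c.object_id = OBJECT_ID(N'dbo.ResImport');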


  • No, I had completely deleted the import into the table. It was 2 steps: flat file source and data conversion.

    In the meantime, here's an update: I got it working, just through trial and error:

    1. Set up a Derived Column step and did a LEFT(VOPENDATE,10). This stripped off the "T00:00:00" from the date string.
    2. Changed the data conversion step to convert VOPENDATE to DT_DBDATE, since the time component is gone (I don't need it anyway). A sketch of the expression is below.
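
    In case it helps anyone else, here is roughly what the Derived Column expression looks like with NULL handling folded in (a sketch - I'm assuming the raw flat-file column VOPENDATE_op as the input, and the DT_DBDATE cast here stands in for the separate conversion step; LEFT requires SSIS 2012 or later):

        ISNULL([VOPENDATE_op]) ? NULL(DT_DBDATE) : (DT_DBDATE)LEFT([VOPENDATE_op],10)

    Note that depending on the connection manager's "retain null values" setting, blanks may come through as empty strings rather than NULLs, in which case an extra check against "" would be needed.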

    Now the conversion completes, AND I can import the new date fields into the SQL table.

    I know... there is no rhyme or reason why every other date field, formatted the same way, runs through data conversion as DT_DBTIMESTAMP just fine. But it works.

    Can't thank you guys enough, even if it was just to help me think outside the box to get this thing to work.

    --

    Scot

  • Well done on getting it working ... and if you don't need the time component, that's an even better solution than what you had originally planned.

