Can't Import File

  • Hi everyone

    I started another thread about downloading a CSV file from the web.  I have come across a problem that I can't figure out, so I am starting a separate thread for it.  I cannot import the file into SQL Server.  Does anyone know why?

    Thank you

    ===

    File

    ===

    https://home.treasury.gov/resource-center/data-chart-center/interest-rates/daily-treasury-rates.csv/all/all?type=daily_treasury_yield_curve&field_tdr_date_value=all&page&_format=csv

    ====

    Error

    ====

    - Executing (Error)
    Messages
    Information 0x402090de: Data Flow Task 1: The total number of data rows processed for file "C:\Users\Me\Discount_Rates\rates.csv" is 8052.
    (SQL Server Import and Export Wizard)

    Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "RATE_MONTH_2" (18) to column "RATE_MONTH_2" (63). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
    (SQL Server Import and Export Wizard)

    Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[RATE_MONTH_2]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[RATE_MONTH_2]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
    (SQL Server Import and Export Wizard)

    Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion 0 - 0" (47) failed with error code 0xC0209029 while processing input "Data Conversion Input" (48). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
    (SQL Server Import and Export Wizard)

    ===========

    Table Definition

    ===========

    CREATE TABLE [dbo].[DiscountRatesRaw](
        [QUOTE_DATE] [date] NOT NULL,
        [RATE_MONTH_1] [numeric](10, 9) NULL,
        [RATE_MONTH_2] [numeric](10, 9) NULL,
        [RATE_MONTH_3] [numeric](10, 9) NULL
    ) ON [PRIMARY]
    GO

    ============

    PowerShell Script

    ============

    $Source = "https://home.treasury.gov/resource-center/data-chart-center/interest-rates/daily-treasury-rates.csv/all/all?type=daily_treasury_yield_curve&field_tdr_date_value=all&page&_format=csv"
    $DestinationTemp1 = "C:\Users\Me\Discount_Rates\ratestemp1.csv"
    $DestinationTemp2 = "C:\Users\Me\Discount_Rates\ratestemp2.csv"
    $Destination = "C:\Users\Me\Discount_Rates\rates.csv"

    Invoke-WebRequest -Uri $Source -OutFile $DestinationTemp1

    $oneshot = $true
    Import-Csv $DestinationTemp1 |
        ForEach-Object {
            if ($oneshot) {
                $oneshot = $false
                "QUOTE_DATE,RATE_MONTH_1,RATE_MONTH_2,RATE_MONTH_3" # header
            }
            "{0},{1},{2},{3}" -f ([datetime]::ParseExact($_.Date, "M/d/yyyy", $null)).ToShortDateString(), $_."1 Mo", $_."2 Mo", $_."3 Mo"
        } | Out-File $DestinationTemp2 -Encoding Utf8

    Get-Content $DestinationTemp2 -Encoding UTF8 | ForEach-Object {$_ -replace '"',''} | Out-File $Destination -Encoding UTF8

    Remove-Item $DestinationTemp1
    Remove-Item $DestinationTemp2


  • At first I thought the problem was with the PowerShell-produced file because I was importing that file.  Then it occurred to me to try the one from the government site.  Same issue.

  • I think the following error line from those that you provided is the key hint as to what is wrong...

    Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "RATE_MONTH_2" (18) to column "RATE_MONTH_2" (63). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

    (SQL Server Import and Export Wizard)

    It would seem to me that the RATE_MONTH_2 column is too small for the data.  You have DECIMAL(10,9) as the datatype.  That only leaves one digit to the left of the decimal point and, perhaps, the data you're trying to import has more than that.


    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    I think the following error line from those that you provided is the key hint as to what is wrong...

    Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "RATE_MONTH_2" (18) to column "RATE_MONTH_2" (63). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.". (SQL Server Import and Export Wizard)

    It would seem to me that the RATE_MONTH_2 column is too small for the data.  You have DECIMAL(10,9) as the datatype.  That only leaves one digit to the left of the decimal point and, perhaps, the data you're trying to import has more than that.

    There "should" be enough space.  The field is defined as [RATE_MONTH_2] [numeric](10, 9) NULL, so it can hold a decimal of the form 0.999999999.  The values in the CSV are interest rates, so they will always be of the form 0.something.  I even tried [RATE_MONTH_2] [numeric](11, 9) NULL and still got the same issue.

    This SSIS package was working fine a few months ago with that table definition.  The government site was changed recently and now the import fails.  Strange.

    When it says '"RATE_MONTH_2" (18)', does that mean the 18th record in the RATE_MONTH_2 column?

    How can I load that file?

  • water490 wrote:

    Jeff Moden wrote:

    I think the following error line from those that you provided is the key hint as to what is wrong...

    Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "RATE_MONTH_2" (18) to column "RATE_MONTH_2" (63). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.". (SQL Server Import and Export Wizard)

    It would seem to me that the RATE_MONTH_2 column is too small for the data.  You have DECIMAL(10,9) as the datatype.  That only leaves one digit to the left of the decimal point and, perhaps, the data you're trying to import has more than that.

    There "should" be enough space.  The field is defined as [RATE_MONTH_2] [numeric](10, 9) NULL, so it can hold a decimal of the form 0.999999999.  The values in the CSV are interest rates, so they will always be of the form 0.something.  I even tried [RATE_MONTH_2] [numeric](11, 9) NULL and still got the same issue.

    This SSIS package was working fine a few months ago with that table definition.  The government site was changed recently and now the import fails.  Strange.

    When it says '"RATE_MONTH_2" (18)', does that mean the 18th record in the RATE_MONTH_2 column?

    How can I load that file?

    RATE_MONTH_2 can hold 0.999999999.

    But any value equal to or greater than +/- 10 will fail.

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/

    Just a thought: start by importing the file with all columns as character columns, then analyse the data and reconstruct the target table.

    😎

    I don't like the Import AND Export Wizard, pretty daft piece of xyz.


    One of the messages states that a total of 8052 rows were processed.  Validate the total number of rows in the file - if there are more rows than that, look at that row and the following row to see if there is anything invalid.

    If there are only 8052 rows in total, but there are additional blank rows or rows containing only the delimiter, you would get the same type of error, because a blank is not a valid value for a numeric data type.
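    To check for that quickly, a small PowerShell pass over the generated file could look like this (a sketch, assuming the same rates.csv path used in the script above):

    ```powershell
    # Count the lines in the file and flag any that are blank, whitespace-only,
    # or contain nothing but delimiters - those rows would break a numeric
    # conversion during import.
    $Path  = "C:\Users\Me\Discount_Rates\rates.csv"
    $Lines = Get-Content $Path

    "Total lines (including header): $($Lines.Count)"

    for ($i = 0; $i -lt $Lines.Count; $i++) {
        # A line made up of only whitespace and commas carries no usable values
        if ($Lines[$i] -match '^[\s,]*$') {
            "Suspect line {0}: '{1}'" -f ($i + 1), $Lines[$i]
        }
    }
    ```

    If the total is higher than 8052, the line numbers this prints are the first places to look.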

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • water490 wrote:

    Jeff Moden wrote:

    I think the following error line from those that you provided is the key hint as to what is wrong...

    Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "RATE_MONTH_2" (18) to column "RATE_MONTH_2" (63). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.". (SQL Server Import and Export Wizard)

    It would seem to me that the RATE_MONTH_2 column is too small for the data.  You have DECIMAL(10,9) as the datatype.  That only leaves one digit to the left of the decimal point and, perhaps, the data you're trying to import has more than that.

    There "should" be enough space.  The field is defined as [RATE_MONTH_2] [numeric](10, 9) NULL, so it can hold a decimal of the form 0.999999999.  The values in the CSV are interest rates, so they will always be of the form 0.something.  I even tried [RATE_MONTH_2] [numeric](11, 9) NULL and still got the same issue.

    This SSIS package was working fine a few months ago with that table definition.  The government site was changed recently and now the import fails.  Strange.

    When it says '"RATE_MONTH_2" (18)', does that mean the 18th record in the RATE_MONTH_2 column?

    How can I load that file?

    "Should".  That's one of the most interesting words ever created.

    Temporarily change the target table to accept a VARCHAR(100) or so, so you can actually import the data and check the max length, etc.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Eirikur Eiriksson wrote:

    Just a thought: start by importing the file with all columns as character columns, then analyse the data and reconstruct the target table. 😎 I don't like the Import AND Export Wizard, pretty daft piece of xyz.

    That's what I usually do, because of issues similar to the ones the OP is having.

  • water490 wrote:

    The values in the CSV are interest rates so they will always be of the form 0.something.  I even tried [RATE_MONTH_2] [numeric](11, 9) NULL and still same issue.

    Have you looked at the file? I assume that you are inserting the 2 Mo column into RATE_MONTH_2. The max value of the 2 Mo column is 2.48. This would fail the initial column definition, but it does work for DECIMAL(11,9). However, there are a number of rows with no value for this column (at least when I opened it in Excel) and, as was previously mentioned, you cannot convert an empty string to numeric. You can convert a null to numeric. I believe there is an option in SSIS to treat empty strings as null, but I don't see it in the Import Wizard. The task would be easier if you imported into a table of varchars and then used TRY_CONVERT to insert into a table with the desired data types (as suggested previously).
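    Both points can be confirmed without opening Excel. A rough PowerShell scan of the raw download (assuming the column header in the Treasury file is "2 Mo", and reusing the ratestemp1.csv path from the script in the first post) might be:

    ```powershell
    # Inspect the "2 Mo" column: find the largest value and count empty cells.
    $Rows = Import-Csv "C:\Users\Me\Discount_Rates\ratestemp1.csv"

    # Rows where the 2 Mo cell is empty or whitespace-only
    $Empty  = @($Rows | Where-Object { [string]::IsNullOrWhiteSpace($_."2 Mo") })

    # Numeric values from the remaining rows
    $Values = $Rows | Where-Object { -not [string]::IsNullOrWhiteSpace($_."2 Mo") } |
                      ForEach-Object { [double]$_."2 Mo" }

    "Rows with an empty 2 Mo value: $($Empty.Count)"
    "Max 2 Mo value: $(($Values | Measure-Object -Maximum).Maximum)"
    ```

    A non-zero empty count is enough to reproduce the "potential loss of data" error on its own, regardless of the column's precision.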

  • homebrew01 wrote:

    Eirikur Eiriksson wrote:

    Just a thought: start by importing the file with all columns as character columns, then analyse the data and reconstruct the target table. 😎 I don't like the Import AND Export Wizard, pretty daft piece of xyz.

    That's what I usually do, because of issues similar to the ones the OP is having.

    That's how I normally start and then determine the datatypes from there.  There are times, though, when I'm importing 800 columns at a time and it's just too much manual labor to do that type of thing.  The good part in those situations is that it's normally (for me) coming from a known source that has thousands of customers (like me), and so the "record layout" can be trusted a whole lot (I parse the electronic version of that and have it build the table and the BCP format file, which I almost always use).

    Still, I trust almost nothing when it comes to importing data, and so I'll also take the time to set up error handling in either the BCP or BULK INSERT that I use for imports.  If an error does occur, it's set up to sequester the rows for manual correction later and continue.  I also do a little trick to have it tell me the human-understandable error instead of what is normally saved in the designated error file, which is a pain to decipher.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Thank you everyone who replied!

    I know what the problem is.  I looked at the old Power Query script that worked before, and in it I was replacing empty cells with a 0.  That is why that SSIS package succeeded.  I have to revise my PowerShell script so it converts empty cells to 0.

    Thank you again!!
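    For anyone finding this later, one way to make that revision to the script from the first post is sketched below. The Get-RateOrZero helper is hypothetical (not from the original script); the surrounding variables ($DestinationTemp1, $DestinationTemp2) are the ones defined earlier in the thread:

    ```powershell
    # Revised row builder: blank cells become 0, so the numeric conversion
    # in SSIS no longer fails on empty strings.
    function Get-RateOrZero ($Value) {
        if ([string]::IsNullOrWhiteSpace($Value)) { "0" } else { $Value }
    }

    $oneshot = $true
    Import-Csv $DestinationTemp1 |
        ForEach-Object {
            if ($oneshot) {
                $oneshot = $false
                "QUOTE_DATE,RATE_MONTH_1,RATE_MONTH_2,RATE_MONTH_3" # header
            }
            "{0},{1},{2},{3}" -f ([datetime]::ParseExact($_.Date, "M/d/yyyy", $null)).ToShortDateString(),
                (Get-RateOrZero $_."1 Mo"), (Get-RateOrZero $_."2 Mo"), (Get-RateOrZero $_."3 Mo")
        } | Out-File $DestinationTemp2 -Encoding Utf8
    ```

    Whether 0 is the right substitute (as opposed to leaving the cell blank and loading it as NULL) depends on how downstream calculations treat missing rates.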
