Hi
I am using these rates for a calculation
Currently, I manually update a local table and then insert that table into SQL Server. It would be ideal if I could get SQL Server to access the rates directly from that site. Is this possible? If so, how would I do that?
Thank you
It looks like you can now use Power Query as a source in SSIS. The nice thing is that Excel can write the Power Query for you - all you have to do is click buttons. Then you can copy and paste that into SSIS (I think).
See this article: Power Query Source - SQL Server Integration Services (SSIS) | Microsoft Docs
In Excel (I'm using 2019 or 365), go to the Data tab, Get Data, From Web... you'll get a popup prompting you for the web address. Paste in the URL.
Then you have to choose the table the data is in. If you go to the Advanced Editor, you can read the M query that Excel (or Power BI) wrote:
let
    Source = Web.Page(Web.Contents("https://www.treasury.gov/resource-center/data-chart-center/interest-rates/pages/textview.aspx?data=yield")),
    Data0 = Source{0}[Data],
    #"Changed Type" = Table.TransformColumnTypes(Data0,{{"Date", type date}, {"1 mo", type number}, {"2 mo", type number}, {"3 mo", type number}, {"6 mo", type number}, {"1 yr", type number}, {"2 yr", type number}, {"3 yr", type number}, {"5 yr", type number}, {"7 yr", type number}, {"10 yr", type number}, {"20 yr", type number}, {"30 yr", type number}})
in
    #"Changed Type"
Then you should be able to use that as a step in SSIS.
January 3, 2022 at 3:34 am
perfect!
Thank you so much
January 3, 2022 at 4:48 pm
Now THAT's interesting. Just curious... do you know what the page will be for 2022? Or will it magically change at the end of the month?
The reason I think this is so cool is that I just tried the steps pietlinden posted and it makes a very nice spreadsheet. That means I could build a macro into it and set up a job on a system that has Excel loaded on it (never a SQL Server, IMHO), and it could automagically download the data and save itself as a dated Excel spreadsheet file. Then I can set up a T-SQL job that uses the ACE drivers to auto-magically read that spreadsheet file into a table, and Bob's your uncle.
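For the T-SQL side of that, a rough sketch of the ACE-driver read might look like this (the file path, sheet name, and target table are just placeholders, and the server needs Ad Hoc Distributed Queries and the ACE provider enabled):
-- Sketch only: path, sheet name, and target table are made up for illustration.
INSERT INTO dbo.DailyYieldCurve
SELECT *
FROM OPENROWSET(
        'Microsoft.ACE.OLEDB.12.0',
        'Excel 12.0;Database=C:\Rates\YieldCurve_20220103.xlsx;HDR=YES',
        'SELECT * FROM [Sheet1$]'
     );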
This could be a game changer for some things that people do. I didn't know they had built that capability into Excel.
Thank you both for this thread! 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
January 3, 2022 at 5:35 pm
Now THAT's interesting. Just curious... do you know what the page will be for 2022? Or will it magically change at the end of the month?
Yes, that link should show all the years. All you have to do is refresh the query that pietlinden provided.
January 4, 2022 at 12:03 am
Now THAT's interesting. Just curious... do you know what the page will be for 2022? Or will it magically change at the end of the month?
Yes, that link should show all the years. All you have to do is refresh the query that pietlinden provided.
Since it's 2022 right now, how could you use that link to show the data from 2020?
--Jeff Moden
Change is inevitable... Change for the better is not.
January 4, 2022 at 12:14 am
water490 wrote: Now THAT's interesting. Just curious... do you know what the page will be for 2022? Or will it magically change at the end of the month?
Yes, that link should show all the years. All you have to do is refresh the query that pietlinden provided.
Since it's 2022 right now, how could you use that link to show the data from 2020?
That particular link shows all years of data, starting from 1990.
January 4, 2022 at 12:22 am
I want to say you create a function that generates the page(s) you need to go to. I saw Reza Rad do it in a class I took. It was mind-boggling how much you can do in Power Query if you know what you're doing. You basically feed it a table of values and then use that to generate the page numbers (or whatever) in the web address(es). It's kind of like a Tally table trick. It should be on his website somewhere, or in his book PowerBI: From Rookie to Rockstar (look it up online... it's a free download and absolutely huge).
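Just to show the Tally-table flavor of that idea on the SQL side, here's a rough sketch that generates one per-year URL per row (the data=yieldYear&year=NNNN pattern is the per-year link used in the PowerShell post below; the year range is an assumption). Something like SSIS or PowerShell would then loop over the rows and fetch each page.
-- Sketch only: the year range (1990 onward) is an assumption.
SELECT  t.Yr,
        Url = 'https://www.treasury.gov/resource-center/data-chart-center/interest-rates/pages/TextView.aspx?data=yieldYear&year='
              + CAST(t.Yr AS varchar(4))
FROM   (SELECT TOP (33)
               Yr = 1989 + ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
        FROM   sys.all_objects) AS t
ORDER BY t.Yr;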
January 4, 2022 at 12:52 am
A better option for me would be to use a small PowerShell script to download the contents and load them directly into SQL Server.
Edit: the original link only extracts the current year; there are separate links for all years and for a particular year (the per-year link is the one used in the script below - replace 2021 with the desired year).
This is a very rough attempt at it (working!), as I don't have my personal scripts available:
function ConvertTo-DataTable {
    <#
        .SYNOPSIS
            Convert regular PowerShell objects to a DataTable object.
        .DESCRIPTION
            Convert regular PowerShell objects to a DataTable object.
        .EXAMPLE
            $myDataTable = $myObject | ConvertTo-DataTable
        .NOTES
            Name: ConvertTo-DataTable
            Author: Øyvind Kallstad @okallstad
            Version: 1.1
    #>
    [CmdletBinding()]
    param (
        # The object to convert to a DataTable
        [Parameter(ValueFromPipeline = $true)]
        [PSObject[]] $InputObject,

        # Override the default type.
        [Parameter()]
        [string] $DefaultType = 'System.String'
    )

    begin {
        # create an empty datatable
        try {
            $dataTable = New-Object -TypeName 'System.Data.DataTable'
            Write-Verbose -Message 'Empty DataTable created'
        }
        catch {
            Write-Warning -Message $_.Exception.Message
            break
        }

        # define a boolean to keep track of the first datarow
        $first = $true

        # define array of supported .NET types
        $types = @(
            'System.String',
            'System.Boolean',
            'System.Byte[]',
            'System.Byte',
            'System.Char',
            'System.DateTime',
            'System.Decimal',
            'System.Double',
            'System.Guid',
            'System.Int16',
            'System.Int32',
            'System.Int64',
            'System.Single',
            'System.UInt16',
            'System.UInt32',
            'System.UInt64'
        )
    }

    process {
        # iterate through each input object
        foreach ($object in $InputObject) {
            try {
                # create a new datarow
                $dataRow = $dataTable.NewRow()
                Write-Verbose -Message 'New DataRow created'

                # iterate through each object property
                foreach ($property in $object.PSObject.get_properties()) {

                    # check if we are dealing with the first row or not
                    if ($first) {

                        # handle data types
                        if ($types -contains $property.TypeNameOfValue) {
                            $dataType = $property.TypeNameOfValue
                            Write-Verbose -Message "$($property.Name): Supported datatype <$($dataType)>"
                        }
                        else {
                            $dataType = $DefaultType
                            Write-Verbose -Message "$($property.Name): Unsupported datatype ($($property.TypeNameOfValue)), using default <$($DefaultType)>"
                        }

                        # create a new datacolumn
                        $dataColumn = New-Object 'System.Data.DataColumn' $property.Name, $dataType
                        Write-Verbose -Message 'Created new DataColumn'

                        # add column to DataTable
                        $dataTable.Columns.Add($dataColumn)
                        Write-Verbose -Message 'DataColumn added to DataTable'
                    }

                    # add values to column
                    if ($property.Value -ne $null) {
                        # if array or collection, add as XML
                        if (($property.Value.GetType().IsArray) -or ($property.TypeNameOfValue -like '*collection*')) {
                            $dataRow.Item($property.Name) = $property.Value | ConvertTo-Xml -As 'String' -NoTypeInformation -Depth 1
                            Write-Verbose -Message 'Value added to row as XML'
                        }
                        else {
                            $dataRow.Item($property.Name) = $property.Value -as $dataType
                            Write-Verbose -Message "Value ($($property.Value)) added to row as $($dataType)"
                        }
                    }
                }

                # add DataRow to DataTable
                $dataTable.Rows.Add($dataRow)
                Write-Verbose -Message 'DataRow added to DataTable'

                $first = $false
            }
            catch {
                Write-Warning -Message $_.Exception.Message
            }
        }
    }

    end { Write-Output (,($dataTable)) }
}
function get-table
{
    param(
        [Parameter(Mandatory = $true)]
        [Microsoft.PowerShell.Commands.HtmlWebResponseObject] $WebRequest,

        [Parameter(Mandatory = $true)]
        [int] $TableNumber
    )

    ## Extract the tables out of the web request
    $tables = @($WebRequest.ParsedHtml.getElementsByTagName("TABLE"))
    $table = $tables[$TableNumber]
    $titles = @()
    $rows = @($table.Rows)

    ## Go through all of the rows in the table
    foreach ($row in $rows)
    {
        $cells = @($row.Cells)

        ## If we've found a table header, remember its titles
        if ($cells[0].tagName -eq "TH")
        {
            $titles = @($cells | % { ("" + $_.InnerText).Trim() })
            continue
        }

        ## If we haven't found any table headers, make up names "P1", "P2", etc.
        if (-not $titles)
        {
            $titles = @(1..($cells.Count + 2) | % { "P$_" })
        }

        ## Now go through the cells in the row. For each, try to find the
        ## title that represents that column and create a hashtable mapping those
        ## titles to content
        $resultObject = [Ordered] @{}
        for ($counter = 0; $counter -lt $cells.Count; $counter++)
        {
            $title = $titles[$counter]
            if (-not $title) { continue }
            $resultObject[$title] = ("" + $cells[$counter].InnerText).Trim()
        }

        ## And finally cast that hashtable to a PSCustomObject
        [PSCustomObject] $resultObject
    }
}
$url = 'https://www.treasury.gov/resource-center/data-chart-center/interest-rates/pages/TextView.aspx?data=yieldYear&year=2021'
$r = Invoke-WebRequest $url
$tables = @($r.ParsedHtml.getElementsByTagName("TABLE"))
# below just to output to a file to check contents
#get-table -WebRequest $r -TableNumber 1|Export-Csv -NoTypeInformation "c:\temp\test.csv"
$s=get-table -WebRequest $r -TableNumber 1|ConvertTo-DataTable
$ConnectionString = "Data Source=servername; Initial Catalog=test; Trusted_Connection=True;";
$bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $ConnectionString
$bulkCopy.DestinationTableName = "rates"
$bulkCopy.WriteToServer($s)
<#
Create the above table as follows. With a bit of work (not much) we could create a DataTable
with the correct datatypes and other column names.
We could also wrap this in a transaction, add the data to a temp table,
and then process it onto the final table with the correct names and datatypes.
create table rates
(
[Date] varchar(50)
,[1 mo] decimal(10, 4)
,[2 mo] decimal(10, 4)
,[3 mo] decimal(10, 4)
,[6 mo] decimal(10, 4)
,[1 yr] decimal(10, 4)
,[2 yr] decimal(10, 4)
,[3 yr] decimal(10, 4)
,[5 yr] decimal(10, 4)
,[7 yr] decimal(10, 4)
,[10 yr] decimal(10, 4)
,[20 yr] decimal(10, 4)
,[30 yr] decimal(10, 4)
)
#>
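For the "process it onto the final table" step that comment describes, a rough T-SQL sketch might look like this (dbo.DailyTreasuryRates and its column names are made up for illustration; dbo.rates is the table the script above bulk-copies into):
-- Sketch only: the final table name and column names are assumptions.
BEGIN TRANSACTION;

INSERT INTO dbo.DailyTreasuryRates
        (RateDate, Mo1, Mo2, Mo3, Mo6, Yr1, Yr2, Yr3, Yr5, Yr7, Yr10, Yr20, Yr30)
SELECT  TRY_CONVERT(date, r.[Date]),
        r.[1 mo], r.[2 mo], r.[3 mo], r.[6 mo],
        r.[1 yr], r.[2 yr], r.[3 yr], r.[5 yr],
        r.[7 yr], r.[10 yr], r.[20 yr], r.[30 yr]
FROM    dbo.rates AS r;

TRUNCATE TABLE dbo.rates;

COMMIT TRANSACTION;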
January 4, 2022 at 1:29 am
That's awesome, Frederico! Thanks for posting that.
It would appear, though, that you actually have to program for the table structure. Is there anything like the Get Data From Web trick that Excel has? Basically, you define the source URL and a couple of settings, click the final "OK" button, and it's all done for you.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 4, 2022 at 8:21 am
That's awesome, Frederico! Thanks for posting that.
It would appear, though, that you actually have to program for the table structure. Is there anything like the Get Data From Web trick that Excel has? Basically, you define the source URL and a couple of settings, click the final "OK" button, and it's all done for you.
Yes, you do need to program for your desired table structure. I have found another scriptlet that will create the table for you, but most times that is not what you would want in any case.
But the above can be used to output to a CSV file, which you can then load into SQL as standard.
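If you go the CSV route, loading it back in is then a standard BULK INSERT - a rough sketch (the path and options below are placeholders for whatever Export-Csv produced, and FORMAT = 'CSV' needs SQL Server 2017+):
-- Sketch only: path and options are placeholders; adjust terminators/quoting to match the exported file.
BULK INSERT dbo.rates
FROM 'C:\temp\test.csv'
WITH (
    FIRSTROW        = 2,      -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FORMAT          = 'CSV'   -- handles the quoted fields Export-Csv writes (SQL Server 2017+)
);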
But having said that, with a bit of programming it is possible to create a set of scripts that will do all of that for you - it's just not something most folks will spend time on unless there is a reason for it.
January 14, 2022 at 12:43 pm
Just saw a video from Chandoo (Excel MVP), right here. Watch the next-to-last section - the biggest game changer in Excel in the last year is Power Query, according to him. You can basically build your data transformation pipelines and see the outputs... (well, watch that part of the video!). He has tons of videos on YouTube... definitely worth watching - no fluff, no sales pitch, just really good content.