Back in October 2016, Microsoft started a new GitHub repository – ReportingServicesTools – an open source PowerShell module to manage SQL Server Reporting Services.
This project contains PowerShell scripts that allow you to perform various operations with SQL Server Reporting Services.
As of today, the module provides 28 commands that help you accomplish multiple tasks with less effort.
- Want to verify that all reports use the same data source?
- Need to back up your encryption key?
- Download or upload a bunch of reports and/or data sources?
- Grant or revoke access?
- And more
Aaron Nelson has written quite a bit about this module, but I hadn’t worked with SQL Server Reporting Services enough to warrant using it. Now, however, I have received a request that took me a lot of time to do manually. I don’t like repetitive and tedious work (I call it “robot work”), especially manual work, which is more error-prone.
In this post I will share the request with you and how I automated it, saving a lot of time. Just to keep you interested: I went from 23 and a half minutes down to 3 (your mileage may vary depending on the number of objects/actions you need to handle).
The request
- Create new folder “FolderB”
- We need to deploy a copy of the reports and data source to a new folder (“FolderB”). You should get the existing ones from the folder “FolderA” on the same server.
- Then you have to change the data source to point to the database “dbRS” with the login “ReportingUser”
- Finally, we need to change the data source for each report to match the new data source pointing to the database “dbRS” created in the last step.
Let’s translate it into detailed tasks:
- We need to create a folder – no big deal!
- Copy reports from one folder to another, which means downloading each report and then uploading each one to the new folder. How many reports were in my scenario? 14!
- Create a new data source pointing to a specific instance/database with a specific login
- Change the data source for each report, which means multiplying the time a single change takes by 14, or, in my case, by 10, because 4 of the reports do not have a data source (one is used as a footer and three as headers).
Doing some math
Let’s see how much time (estimates based on my experience/tests) we will need to spend on this:
- Create the folder: 15 seconds.
- Copy the reports: here we have two main tasks, download and upload.
  - Download: from my tests, downloading a single report takes about 10 seconds (remember, we use the browser to download them). 10 seconds times 14 reports = 140 seconds. That’s an additional 2 minutes and 20 seconds.
  - Upload: click on “Upload file” -> Browse -> Select -> Ok (close browse window) -> Ok (do the upload). This also takes about 10 seconds times 14 reports = 140 seconds.
- Create the data source from scratch: I will say 60 seconds, assuming you have the connection string backed up somewhere so you can copy and paste it to be faster.
- Change the data source on each report: the most boring one. Let me ask you, how many clicks do we need to do it? Any idea? I will tell you: eight! Yes, 8 clicks, which I estimate at 10 seconds per report, plus the time to go back to the folder (say 10 seconds in total). Also, if we don’t know beforehand which reports have a data source, we may open a report’s properties only to find there is no “Data Source” option, and that time was spent for nothing. So for 10 reports let’s estimate 1 minute and 55 seconds.
Sum them all and we will need at least 15 + 280 + 60 + 115 = 470 seconds, or about 7 minutes and 50 seconds.
Is that all?
Well, that will depend on how many environments you administer.
In my case it is at least 3, which means more than 20 minutes.
But only if:
- nothing fails
- we don’t click on the wrong place
- we don’t make any typos (remember, we are doing it by hand)
And what if this is a common request for other applications? Do we want to repeat all the work again and again?
Automating it with ReportingServicesTools module
Installation
To get this module you can run one of the following lines of code:
Install-Module -Name ReportingServicesTools
or, if you don’t have access to the PowerShell Gallery, you can use:
Invoke-Expression (Invoke-WebRequest https://aka.ms/rstools)
To import the module and start using it, run:
Import-Module "$home\Documents\WindowsPowerShell\Modules\ReportingServicesTools\src\ReportingServicesTools.psd1" -Force
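Once imported, a quick sanity check helps before touching a server. The two commands below are standard PowerShell (not specific to this module) and only inspect what was loaded:

```powershell
# List every command shipped by the module, grouped by verb
Get-Command -Module ReportingServicesTools | Sort-Object Verb | Format-Table Verb, Name

# Read the built-in examples for a command before running it
Get-Help New-RsFolder -Examples
```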
The code
Here is the code you will need for each task. Don’t forget to change the values of the variables.
1. Create new folder “FolderB”
#Report server URL parameter
$reportServerUri = 'http://localhost/ReportServer_URL'

#destination parameters
$destinationFolderPath = "/"
$destinationFolderName = "FolderB"

# Creates a new folder on the root of the report server
New-RsFolder -ReportServerUri $reportServerUri -RsFolder $destinationFolderPath -FolderName $destinationFolderName
2. We need to deploy a copy of the reports and data source to a new folder (“FolderB”). You should get the existing ones from the folder “FolderA” on the same server.
First, the download to a local folder (the $downloadFolder variable):
#Report server URL parameter
$reportServerUri = 'http://localhost/ReportServer_URL'

#source parameters
$sourceRSFolder = "/FolderA"
$downloadFolder = "D:\Temp\FolderA_Reports"

#Download all objects of type Report
Get-RsFolderContent -ReportServerUri $reportServerUri -RsFolder $sourceRSFolder |
    Where-Object TypeName -eq 'Report' |
    Select-Object -ExpandProperty Path |
    Out-RsCatalogItem -ReportServerUri $reportServerUri -Destination $downloadFolder
Then, the upload
#Report server URL parameter
$reportServerUri = 'http://localhost/ReportServer_URL'

#source parameters
$downloadFolder = "D:\Temp\FolderA_Reports"

#destination parameters
$destinationFolderPath = "/"
$destinationFolderName = "FolderB"
$newRSFolderPath = "$destinationFolderPath/$destinationFolderName"

#Upload all files from the download folder
Write-RsFolderContent -ReportServerUri $reportServerUri -Path $downloadFolder -RsFolder $newRSFolderPath
3. Then you have to change the data source to point to the database “dbRS”. When running this piece of code, you will be prompted for a user (the “ReportingUser”) and a password
#Report server URL parameter
$reportServerUri = 'http://localhost/ReportServer_URL'

#destination parameters
$destinationFolderPath = "/"
$destinationFolderName = "FolderB"

#data source configuration
$newRSDSFolder = "$destinationFolderPath/$destinationFolderName"
$newRSDSName = "Datasource"
$newRSDSExtension = "SQL"
$newRSDSConnectionString = "Initial Catalog=dbRS; Data Source=instance"
$newRSDSCredentialRetrieval = "Store"
$newRSDSCredential = Get-Credential

#Add new data source
New-RsDataSource -ReportServerUri $reportServerUri -RsFolder $newRSDSFolder -Name $newRSDSName -Extension $newRSDSExtension -ConnectionString $newRSDSConnectionString -CredentialRetrieval $newRSDSCredentialRetrieval -DatasourceCredentials $newRSDSCredential
4. Finally, we need to change the data source for each report to match the new data source pointing to the database “dbRS” created in the last step.
#Report server URL parameter
$reportServerUri = 'http://localhost/ReportServer_URL'

#destination parameters
$destinationFolderPath = "/"
$destinationFolderName = "FolderB"
$newRSFolderPath = "$destinationFolderPath/$destinationFolderName"

#data source parameters
$newRSDSName = "Datasource"
$DataSourcePath = "$newRSFolderPath/$newRSDSName"

# Set report data source
Get-RsCatalogItems -ReportServerUri $reportServerUri -RsFolder $newRSFolderPath |
    Where-Object TypeName -eq 'Report' |
    ForEach-Object {
        $dataSource = Get-RsItemReference -ReportServerUri $reportServerUri -Path $_.Path

        if ($null -ne $dataSource) {
            Set-RsDataSourceReference -ReportServerUri $reportServerUri -Path $_.Path -DataSourceName $dataSource.Name -DataSourcePath $DataSourcePath
            Write-Output "Changed data source $($dataSource.Name) to $DataSourcePath on report $($_.Path)"
        } else {
            Write-Warning "Report $($_.Path) does not contain a data source"
        }
    }
This one is a little more complex, mainly because there is no single command that does it all.
Explaining it
We get all items of type Report inside our new folder (the $newRSFolderPath variable), and for each item we get its references using the Get-RsItemReference command. Then, if we have a $dataSource (remember that some .rdl files may not have a data source), we use the Set-RsDataSourceReference command to point each report to the new data source.
Putting it all together
See it in action!
Here is the full script with all the variables. You just need to change the values of the variables according to your environment and test it.
Import-Module "$home\Documents\WindowsPowerShell\Modules\ReportingServicesTools\src\ReportingServicesTools.psd1" -Force

#Report server URL parameter
$reportServerUri = 'http://localhost/ReportServer_URL'

#source parameters
$sourceRSFolder = "/FolderA"
$downloadFolder = "D:\Temp\FolderA_Reports"

#destination parameters
$destinationFolderPath = "/"
$destinationFolderName = "FolderB"
$newRSFolderPath = "$destinationFolderPath/$destinationFolderName"

#data source configuration
$newRSDSFolder = "$destinationFolderPath/$destinationFolderName"
$newRSDSName = "Datasource"
$newRSDSExtension = "SQL"
$newRSDSConnectionString = "Initial Catalog=Db1; Data Source=Server1"
$newRSDSCredentialRetrieval = "Store"
$newRSDSCredential = Get-Credential -Message "Enter user credentials for data source"
$DataSourcePath = "$newRSDSFolder/$newRSDSName"

# Creates a new folder on the root of the report server
New-RsFolder -ReportServerUri $reportServerUri -RsFolder $destinationFolderPath -FolderName $destinationFolderName

#Download all objects of type Report
Get-RsFolderContent -ReportServerUri $reportServerUri -RsFolder $sourceRSFolder |
    Where-Object TypeName -eq 'Report' |
    Select-Object -ExpandProperty Path |
    Out-RsCatalogItem -ReportServerUri $reportServerUri -Destination $downloadFolder

#Alternative: download all files in the directory
#Out-RsFolderContent -ReportServerUri $reportServerUri -RsFolder $sourceRSFolder -Destination $downloadFolder

#Upload all files from the download folder
Write-RsFolderContent -ReportServerUri $reportServerUri -Path $downloadFolder -RsFolder $newRSFolderPath

#Add new data source
New-RsDataSource -ReportServerUri $reportServerUri -RsFolder $newRSDSFolder -Name $newRSDSName -Extension $newRSDSExtension -ConnectionString $newRSDSConnectionString -CredentialRetrieval $newRSDSCredentialRetrieval -DatasourceCredentials $newRSDSCredential

# Set report data source
Get-RsCatalogItems -ReportServerUri $reportServerUri -RsFolder $newRSFolderPath |
    Where-Object TypeName -eq 'Report' |
    ForEach-Object {
        $dataSource = Get-RsItemReference -ReportServerUri $reportServerUri -Path $_.Path

        if ($null -ne $dataSource) {
            Set-RsDataSourceReference -ReportServerUri $reportServerUri -Path $_.Path -DataSourceName $dataSource.Name -DataSourcePath $DataSourcePath
            Write-Output "Changed data source $($dataSource.Name) to $DataSourcePath on report $($_.Path)"
        } else {
            Write-Warning "Report $($_.Path) does not contain a data source"
        }
    }
Wrap up
I hope that you can leverage this script to accomplish your common tasks, or at least use it as a guide for a different one.
Remember all the time you will save, and how drastically the probability of error will drop.
Because I’m always improving my scripts, if you want to get the most recent version of the code you can download it from my GitHub repository.
NOTE: The number 700 used in the title was calculated based on the roughly 7 minutes and 50 seconds of manual work versus 1 minute (the script itself runs in under 30 seconds) when using this automated approach. To be precise, it is 783%, or 7.83 times faster.
Thank you for reading.