
So You Want to Deploy Power BI Project files (PBIPs)?


Have you heard the news about the new Power BI Project files? Okay, maybe not news anymore since it was announced over a year ago. Just in case you hadn’t heard, Microsoft has introduced a new “payload” format that is human readable (it’s JSON) instead of a binary format like the original .pbix. This is great news for source control: you can now easily see the differences between versions, so you know exactly what changed.

This new “payload” format essentially “unzips” the contents of the .pbix and stores the pieces as plain files and folders on disk. The payload consists of a .pbip file and one or more folders containing all the parts and pieces you need for your report and/or semantic model.

When it was announced, there was a collective cheer from Power BI source control advocates heard ’round the world. Since its preview release, Microsoft has also added Git integration with Fabric workspaces. This makes it easy to incorporate source control for all (or almost all) of your Fabric artifacts, including Power BI.

But what happens when your organization already has a mature CI/CD process in place using Azure DevOps? Do you really want to break from that pattern and have it controlled somewhere else? That’s what this post is about: using Azure DevOps CI/CD pipelines to deploy your Power BI Project files (.pbip).

I’m going to share my experience in hopes that it will save you some time if this is the route you need to take.

Prerequisites

  • Power BI premium capacity workspace or Fabric workspace – For Power BI workspaces, this can be a PPU workspace or a dedicated capacity SKU; for Fabric workspaces, this can be any workspace backed by an F SKU
  • Azure DevOps Repo – Repository for your source code and pipelines
  • Service Principal – Used by the Azure DevOps pipeline to authenticate to the Power BI service, this account will also need at least contributor permission on the workspaces you are deploying to
  • Fabric PowerShell cmdlets – Rui Romano at Microsoft has created these and made them publicly available via GitHub – they serve as a wrapper for the Fabric APIs
  • PowerShell 7.0 or higher – The Fabric PowerShell cmdlets require PowerShell 7.0 or higher (there’s a quick sanity check after this list)
  • Power BI Desktop March 2024 or later – You will need this to create the Power BI project files
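
Before wiring anything into a pipeline, it’s worth confirming the module loads under PowerShell 7. Here’s a minimal sanity check; the module file name and local path are assumptions on my part, so point at wherever you saved the cmdlets downloaded from GitHub:

  # The Fabric cmdlets will not load under Windows PowerShell 5.1, so confirm the version first
  $PSVersionTable.PSVersion

  # Path and file name are placeholders - use wherever you saved the module from GitHub
  Import-Module ".\modules\FabricPS-PBIP.psm1" -Force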

Decisions To Make

There are some decisions that need to be made before you get started, and they are worth thinking through carefully.

  • Will your organization be separating semantic models from reports, which is a best practice for encouraging semantic model reuse? This becomes important when thinking about how to structure your repo.
    • I chose to separate my semantic models from reports, to encourage semantic model reuse.
  • How will your organization structure your repo? Are you creating a separate repo for Power BI artifacts? What will the folder structure look like for Power BI items in your repo? This becomes important for scalability.
    • I chose to use a folder structure that has the deploy type (semantic model or report) at the top, followed by the name of the workspace. The path for semantic models looks something like <repo root>\Datasets\<semantic model workspace name>\<your pbip file/payload>. (I purposefully chose the word “Datasets” instead of “Semantic Models” because path length is limited to 256 characters, so I save characters where I can.) For reports, it looks something like <repo root>\Reports\<report workspace name>\<your pbip file/payload>. There’s a sample layout right after this list.
  • Does your organization have the PowerShell skills? I’m going to assume yes, since your organization already has a mature CI/CD process in place using Azure DevOps. This will be important when it comes to building payloads for deploy.
    • Most of the PowerShell you will need is around the IO file system, but you will also need to be familiar with looping and conditional statements.
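
To make that concrete, here is roughly what the layout described above looks like (the workspace and file names are made up for illustration):

  <repo root>
    Datasets
      Sales Workspace
        Sales.pbip
        Sales.SemanticModel\
    Reports
      Sales Workspace
        Sales Report.pbip
        Sales Report.Report\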

Creating the Pipelines

In Azure DevOps, you have pipeline pipelines (no, that is not a typo) and release pipelines. This has always confused me; they are both pipelines, but “pipeline pipelines” just sounds weird to me. My OCD brain needs something to distinguish them, so I call pipelines “build pipelines”. For release pipelines, well, my brain accepts “release pipelines”, so all good there. But I digress.

Build Pipeline

I used the build pipeline to build my payload of files needed for deployment, based on the files that have changed since the last commit. Now you may be asking, why do you need to build a payload? We know what files changed, so what more do we need? Well, that’s where the Fabric PowerShell cmdlets come in. You can either deploy a single item or deploy multiple items. The catch is that the parameter for the item(s) to deploy is a folder, not a single file.
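
For reference, the “what changed since the last commit” piece can be as simple as asking git. This is a minimal sketch that compares the current commit to its parent; the folder names match the repo structure above, and a more mature pipeline would probably compare against the last successfully deployed commit instead:

  # Files that changed between the previous commit and this one
  # (HEAD~1 is an assumption - use the commit of your last successful deploy if you track it)
  $changedFiles = git diff --name-only HEAD~1 HEAD

  # Keep only files under the Power BI folders we care about
  $pbipChanges = $changedFiles | Where-Object { $_ -like "Datasets/*" -or $_ -like "Reports/*" }

  Write-Host "Changed PBIP-related files:"
  $pbipChanges | ForEach-Object { Write-Host "  $_" }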

I did a bit of poking around in the cmdlets’ code and discovered that they deploy the .SemanticModel and/or .Report folder(s) when they call the Fabric API. These folders are part of the “unzipped” payload of the Power BI Project, and they contain all the parts and pieces needed for your semantic model and/or report, so you have to deploy all of those files/folders. But if you made a change that only affected one file in one of those folders, the rest of the folder won’t show up when you look only at the files that changed since the last commit. This is why you have to build a payload of folders based on the file(s) that changed. This is where the PowerShell file system cmdlets come in, along with looping and conditional statements. Once you have that payload of files, you need to put it in a place where your release pipeline can pick it up and proceed with the actual deploy.
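
Here is a rough sketch of that idea: walk up from each changed file to its enclosing .SemanticModel or .Report folder, then copy that whole folder into the build’s staging directory so the release pipeline gets complete items. It continues from the $pbipChanges list above, assumes it runs from the repo root, and uses the standard Azure DevOps staging variable:

  # Destination the release pipeline will pick up
  $stagingDir = $env:BUILD_ARTIFACTSTAGINGDIRECTORY

  # Walk up from each changed file until we hit its .SemanticModel / .Report folder
  $itemFolders = foreach ($file in $pbipChanges) {
      $dir = Split-Path -Path $file -Parent
      while ($dir -and ($dir -notlike "*.SemanticModel") -and ($dir -notlike "*.Report")) {
          $dir = Split-Path -Path $dir -Parent
      }
      if ($dir) { $dir }
  }

  # Copy each complete item folder (deduplicated), preserving the
  # Datasets\<workspace> / Reports\<workspace> structure for the release pipeline
  foreach ($folder in ($itemFolders | Sort-Object -Unique)) {
      $destination = Join-Path $stagingDir $folder
      New-Item -ItemType Directory -Path (Split-Path $destination -Parent) -Force | Out-Null
      Copy-Item -Path $folder -Destination $destination -Recurse -Force
  }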

Release Pipeline

I used the release pipeline to do the actual deploy of the files in the payload created by the build pipeline. This is where the Fabric PowerShell cmdlets come into play. I used PowerShell again to inspect the payload to determine what parameters to pass to the cmdlets, then did the deploy. Because I thought carefully about how to structure my repo, I was able to deploy on a per-workspace basis with a little bit of PowerShell looping, which makes for a very scalable solution. It doesn’t matter if I make changes to semantic models/reports in more than one workspace; if the changes are in the same commit, they all go, regardless of workspace.
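
To give you a feel for that loop, here is a sketch that walks the payload, derives the target workspace name from the folder structure, and hands each workspace folder to the Fabric cmdlets. I’m assuming the module exposes Set-FabricAuthToken, New-FabricWorkspace, and Import-FabricItems (that’s what the cmdlets were called when I looked at the GitHub repo), so double-check names and parameters against the version you download:

  # Load the Fabric cmdlets (path and file name are placeholders - point at your copy)
  Import-Module ".\modules\FabricPS-PBIP.psm1" -Force

  # Authenticate with the service principal the release pipeline uses
  # (the environment variable names here are just examples)
  Set-FabricAuthToken -servicePrincipalId $env:SPN_CLIENT_ID `
                      -servicePrincipalSecret $env:SPN_CLIENT_SECRET `
                      -tenantId $env:TENANT_ID

  # The payload produced by the build pipeline: Datasets\<workspace> and Reports\<workspace>
  $payloadRoot = $env:SYSTEM_ARTIFACTSDIRECTORY

  foreach ($deployType in @("Datasets", "Reports")) {
      $typeFolder = Join-Path $payloadRoot $deployType
      if (-not (Test-Path $typeFolder)) { continue }

      # Each child folder is named after the target workspace
      foreach ($workspaceFolder in Get-ChildItem -Path $typeFolder -Directory) {
          Write-Host "Deploying $deployType items to workspace '$($workspaceFolder.Name)'"

          # Resolve (or create) the workspace, then deploy every item folder under it
          $workspaceId = New-FabricWorkspace -name $workspaceFolder.Name -skipErrorIfExists
          Import-FabricItems -workspaceId $workspaceId -path $workspaceFolder.FullName
      }
  }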

Assumptions

I did make some assumptions when I created these pipelines:

  • This process will only be used for Development builds/releases
    • Why am I mentioning this? Because there’s this pesky thing called connections. In the paradigm I am using, where semantic models are separated from reports (to encourage semantic model reuse), I am assuming the report’s connection to its semantic model will not change in a development deploy. This means that whatever the connection is in the report, that will be the connection when it goes to the Power BI service.
  • Semantic models used by reports will already exist in the Power BI service
    • When you separate the semantic model from the report, the semantic model must already exist in the Power BI service before you create the report, because that is how the report’s connection gets created. This means you will need to check in/sync your local branch with the remote branch where your semantic model creation/changes live before you can create any reports that use those semantic models.
  • When deploying to any environment other than development, you will either have to use a different release pipeline that modifies the connection or modify your existing release pipeline to handle connections
    • There are options for editing the connection of a report/dataset, and you can use the Fabric PowerShell cmdlets to do this. The catch is that you need a really good naming convention in place to make it happen dynamically. (This is still on my to-do list, so I’m sure there will be another blog post coming once I get it done.) A rough illustration of the idea follows this list.
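
For the curious, here is the shape of the thing you would be editing. A report’s PBIP payload contains a definition.pbir file whose datasetReference points at the semantic model (byPath locally, byConnection once it’s bound to the service). The path below is made up, and the rewrite itself is left as a comment because it depends entirely on your naming convention:

  # Hypothetical report path - substitute one from your own repo
  $pbirPath = ".\Reports\Sales Workspace\Sales Report.Report\definition.pbir"
  $pbir = Get-Content $pbirPath -Raw | ConvertFrom-Json

  # Inspect the current reference to the semantic model
  $pbir.datasetReference | ConvertTo-Json -Depth 10

  # For a non-development release, rewrite $pbir.datasetReference here so it points at the
  # target environment's semantic model (driven by your naming convention), then save it back:
  # $pbir | ConvertTo-Json -Depth 10 | Set-Content $pbirPath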

I hope you found this post useful. These are things that I wish I had known before I started, so I thought they might be useful to others. I’m working on anonymizing my code so I can make it available via GitHub. Stay tuned for details.
