Introduction
Version control is a vital aspect of managing any dynamic system, especially when it comes to data workflows. As organizations increasingly rely on Power BI for business intelligence, ensuring that your dataflows are backed up regularly and versioned correctly is essential.
However, a notable limitation in the current Fabric setup is that Dataflow Gen2 is not tracked via Git. This necessitates alternative strategies for backup and version control.
This article explores a PowerShell script designed to automate the daily backup of Dataflow Gen2 from the Power BI Service. By implementing this solution, you can maintain historical versions of your dataflows and safeguard your data assets, even in the absence of native Git integration.
Why Version Control is Essential
Power BI has revolutionized how businesses handle data visualization and analytics. However, managing dataflows within Power BI can be challenging, particularly when it comes to maintaining different versions over time. Dataflow Gen2, with its enhanced capabilities, brings new opportunities but also requires a more robust versioning strategy. This is where automation through scripting becomes indispensable.
Use Case
Imagine you are an analyst managing multiple Dataflow/Dataflow Gen2 instances in Power BI Service. You need to ensure that every version of these dataflows is archived daily to track changes and revert to previous versions if necessary. Manually copying these dataflows daily would be tedious and error-prone. Instead, by using a PowerShell script, you can automate this task, ensuring that every version is securely stored without manual intervention.
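The daily cadence itself does not come from the script but from a scheduler. On Windows, one hedged way to wire it up is via the ScheduledTasks module; the script path and task name below are assumptions, not part of the original article:

```powershell
# Sketch: register a daily 2 AM run of the backup script.
# The path C:\Scripts\Backup-Dataflows.ps1 and the task name are placeholders.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\Backup-Dataflows.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At '2:00 AM'
Register-ScheduledTask -TaskName 'DataflowDailyBackup' -Action $action -Trigger $trigger
```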
Prerequisites
Before diving into the script, ensure you have the following prerequisites in place:
- Updated version of PowerShell: install the latest version of PowerShell on your system. See Install PowerShell on Windows.
- Power BI module for PowerShell: install the Power BI module to interact with Power BI resources. See how to install the Power BI module.
- Windows PowerShell ISE: use the PowerShell Integrated Scripting Environment (ISE) for writing and testing your scripts.
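The module prerequisite can be satisfied from a PowerShell session; the cmdlets used later in this article live in the MicrosoftPowerBIMgmt package:

```powershell
# Install the Power BI management module for the current user
Install-Module -Name MicrosoftPowerBIMgmt -Scope CurrentUser

# Verify the module and its cmdlets are available
Get-Command -Module MicrosoftPowerBIMgmt*
```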
Script Overview and Key Code Explanations
Let’s break down the key portions of the script to understand its functionality and application.
- Connecting to Power BI Service
Connect-PowerBIServiceAccount
Explanation: This command establishes a connection with the Power BI Service. It is a prerequisite for executing any operations that involve interacting with Power BI resources.
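For an unattended daily backup, an interactive login prompt is impractical. Connect-PowerBIServiceAccount also accepts a service principal; a sketch follows, where the client ID, secret source, and tenant ID are placeholders you must supply (they are not part of the original script):

```powershell
# Non-interactive sign-in with a service principal (IDs are placeholders).
# In a real scheduled job, retrieve the secret from a secure store instead.
$secret = Read-Host -AsSecureString -Prompt 'App secret'
$cred   = New-Object System.Management.Automation.PSCredential('<app-client-id>', $secret)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -TenantId '<tenant-id>'
```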
- Retrieving and Filtering Workspaces
$Workspaces = Get-PowerBIWorkspace
$WorkspaceFilterNames = 'Data'
foreach($Workspace in $Workspaces) {
if($Workspace.Name -eq $WorkspaceFilterNames) {
…
}
}
Explanation: This section retrieves all available Power BI workspaces and filters them by name. The filter is set to match workspaces named “Data”, focusing the script’s operations on relevant areas and improving efficiency.
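If more than one workspace should be backed up, the exact-match `-eq` can be swapped for a pattern match. A minimal variation (the wildcard pattern is an assumption):

```powershell
# Match every workspace whose name starts with 'Data' instead of one exact name
$Workspaces = Get-PowerBIWorkspace | Where-Object { $_.Name -like 'Data*' }
```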
- Managing Backup Directories
if (Test-Path -Path "$Folder\$FolderName") {
$timestamp = Get-Date -Format "yyyy-MM-dd_HH-mm-ss"
$NewFolderName = "$FolderName-$timestamp"
$FolderName = New-Item -Path $Folder -Name $NewFolderName -ItemType "directory"
} else {
$FolderName = New-Item -Path $Folder -Name $FolderName -ItemType "directory"
}
Explanation: This segment checks if a backup folder already exists for the workspace. If it does, the script appends a timestamp to create a unique folder for the current backup session. This prevents overwriting previous backups and maintains a clear version history.
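Daily timestamped folders accumulate over time. The original script does not prune them, but a hedged retention sweep could be appended; the 30-day window below is an arbitrary choice:

```powershell
# Remove backup folders older than 30 days (retention window is an assumption)
Get-ChildItem -Path $Folder -Directory |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Recurse -Force
```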
- Exporting Dataflows
foreach($Dataflow in $Dataflows) {
$ExportFileName = Join-Path $FolderName -ChildPath ($Dataflow.Name + ".json")
Export-PowerBIDataflow -WorkspaceId $Workspace.Id -Id $Dataflow.Id -Scope Individual -OutFile $ExportFileName
}
Explanation: Each dataflow within the filtered workspace is exported as a JSON file. The script constructs the filename using the dataflow’s name and saves it to the previously prepared backup directory.
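The loop assumes `$Dataflows` has already been populated; that would typically come from `Get-PowerBIDataflow` scoped to the filtered workspace, for example:

```powershell
# Retrieve all dataflows in the current workspace before the export loop
$Dataflows = Get-PowerBIDataflow -WorkspaceId $Workspace.Id
```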
- Modifying and Saving JSON
$JSONFile = Get-Content -Path $ExportFileName | ConvertFrom-Json
$JSONFile.'pbi:mashup'.allowNativeQueries = $false
$JSONToWrite = $JSONFile | ConvertTo-Json -Depth 100
Set-Content -Path $ExportFileName -Value $JSONToWrite
Explanation: After exporting the dataflow, the script modifies the JSON content. It sets allowNativeQueries to $false, preventing native queries from running and tightening security. The JSON is then converted back to a string and written to the file, so the exported dataflow reflects the desired configuration.
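Not every exported dataflow necessarily carries a `pbi:mashup` section; a defensive variant of this edit (the null check is an addition, not part of the original script) would be:

```powershell
# Only flip allowNativeQueries when the pbi:mashup section actually exists
$JSONFile = Get-Content -Path $ExportFileName | ConvertFrom-Json
if ($null -ne $JSONFile.'pbi:mashup') {
    $JSONFile.'pbi:mashup'.allowNativeQueries = $false
    Set-Content -Path $ExportFileName -Value ($JSONFile | ConvertTo-Json -Depth 100)
}
```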
Full Script
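The snippets above can be assembled into one runnable sketch. This is a hedged reconstruction, not the author's verbatim script: the backup root `$Folder`, the source of `$Dataflows` via `Get-PowerBIDataflow`, and the closing confirmation message are assumptions.

```powershell
# Hedged reconstruction assembling the snippets in this article.
# $Folder (backup root) and the workspace filter value are assumptions.
Connect-PowerBIServiceAccount

$Folder = 'C:\DataflowBackups'
$WorkspaceFilterNames = 'Data'

$Workspaces = Get-PowerBIWorkspace
foreach ($Workspace in $Workspaces) {
    if ($Workspace.Name -eq $WorkspaceFilterNames) {
        # Create a unique backup folder, timestamped if one already exists
        $FolderName = $Workspace.Name
        if (Test-Path -Path "$Folder\$FolderName") {
            $timestamp  = Get-Date -Format 'yyyy-MM-dd_HH-mm-ss'
            $FolderName = New-Item -Path $Folder -Name "$FolderName-$timestamp" -ItemType 'directory'
        } else {
            $FolderName = New-Item -Path $Folder -Name $FolderName -ItemType 'directory'
        }

        # Export each dataflow in the workspace as JSON
        $Dataflows = Get-PowerBIDataflow -WorkspaceId $Workspace.Id
        foreach ($Dataflow in $Dataflows) {
            $ExportFileName = Join-Path $FolderName -ChildPath ($Dataflow.Name + '.json')
            Export-PowerBIDataflow -WorkspaceId $Workspace.Id -Id $Dataflow.Id `
                -Scope Individual -OutFile $ExportFileName

            # Disable native queries in the exported definition
            $JSONFile = Get-Content -Path $ExportFileName | ConvertFrom-Json
            $JSONFile.'pbi:mashup'.allowNativeQueries = $false
            Set-Content -Path $ExportFileName -Value ($JSONFile | ConvertTo-Json -Depth 100)
        }

        Write-Host "Backed up $($Dataflows.Count) dataflow(s) from workspace '$($Workspace.Name)'."
    }
}
```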
Once you run the script in the PowerShell ISE, you will see a confirmation message like the image below.

Restoring the JSON File for Dataflow Gen2
Many may assume that JSON files are not suitable for Dataflow Gen2. However, there is an effective workaround available. Here’s how you can restore a JSON file for use in a Gen2 dataflow:
Step 1: Begin by exporting the dataflow to a Gen1 format. This step ensures compatibility with the tools and processes used for extraction.
Step 2: From the exported Gen1 dataflow, extract the Power Query template. This template contains the essential logic and transformations needed for your dataflow.
Step 3: Import the extracted Power Query template into your Gen2 dataflow. This step bridges the gap between Gen1 and Gen2, allowing you to leverage the existing logic in the newer environment.
Step 4: Once the template is imported, enable native queries and configure the data destination. This final step ensures that your Gen2 dataflow is fully functional and ready for use.
This approach ensures compatibility and functionality within the Gen2 environment, even when working with JSON files.


Conclusion
Automating the backup and versioning of Dataflow Gen2 instances in Power BI Service is not just a good practice—it is a necessity for maintaining the integrity and reliability of your data. The PowerShell script discussed in this article streamlines this process, allowing you to keep daily versions without the hassle of manual intervention. By ensuring that every version is securely backed up, you can confidently manage your Power BI environment and respond swiftly to any data issues that may arise.