Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)!
Last updated: 26-Apr-2025
[Microsoft Fabric] Deployment Pipelines
- {def} a structured process that enables content creators to manage the lifecycle of their organizational assets [5]
- enable creators to develop and test content in the service before it reaches the users [5]
- can simplify the deployment process to development, test, and production workspaces [5]
- one Premium workspace is assigned to each stage [5]
- each stage can have
- different configurations [5]
- different databases or different query parameters [5]
- {action} create pipeline
- from the deployment pipelines entry point in Fabric [5]
- creating a pipeline from a workspace automatically assigns it to the pipeline [5]
- {action} define how many stages it should have and what they should be called [5]
- {default} has three stages
- e.g. Development, Test, and Production
- the number of stages can be set anywhere between 2 and 10
- ⇐ the number of stages is permanent [5]
- can't be changed after the pipeline is created [5]
- {action} add another stage
- {action} delete stage
- {action} rename stage
- by typing a new name in the box
- {action} share a pipeline with others
- users receive access to the pipeline and become pipeline admins [5]
- {action} add content to the pipeline [5]
- done by assigning a workspace to the pipeline stage [5]
- the workspace can be assigned to any stage (see the sketch below) [5]
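A minimal sketch of both setup actions through the Power BI REST API's pipelines endpoints (the same surface drives Fabric deployment pipelines); the token and GUIDs are placeholders, and the REST create call provisions the classic three-stage layout:

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"
token = "<AAD access token with Pipeline.ReadWrite.All scope>"  # placeholder
workspace_id = "<workspace GUID>"                               # placeholder
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

# Create the pipeline
r = requests.post(f"{API}/pipelines", json={"displayName": "Sales pipeline"},
                  headers=headers)
r.raise_for_status()
pipeline_id = r.json()["id"]

# Assign an existing workspace to a stage; stageOrder is zero-based,
# i.e. 0 = Development in the default three-stage layout
r = requests.post(f"{API}/pipelines/{pipeline_id}/stages/0/assignWorkspace",
                  json={"workspaceId": workspace_id}, headers=headers)
r.raise_for_status()
```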
- {action|optional} make a stage public
- {default} the final stage of the pipeline is made public
- a consumer of a public stage without access to the pipeline sees it as a regular workspace [5]
- without the stage name and deployment pipeline icon on the workspace page next to the workspace name [5]
- {action} deploy to an empty stage
- when finishing the work in one pipeline stage, the content can be deployed to the next stage [5]
- deployment can happen in any direction [5]
- {option} full deployment
- deploy all content to the target stage [5]
- {option} selective deployment
- allows selecting the content to deploy to the target stage [5]
- {option} backward deployment
- deploy content from a later stage to an earlier stage in the pipeline [5]
- {restriction} only possible when the target stage is empty [5]
- {action} deploy content between stages (see the sketch below) [5]
- content can be deployed even if the next stage has content
- paired items are overwritten [5]
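A sketch of full vs. selective deployment via the same REST surface; the request fields shown (sourceStageOrder, options, the per-type item arrays) follow the deployAll/deploy reference, and all GUIDs are placeholders:

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"
token = "<AAD access token>"     # placeholder
pipeline_id = "<pipeline GUID>"  # placeholder
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

# Full deployment: push all content from Development (stage 0) to the next stage
body = {
    "sourceStageOrder": 0,
    "options": {
        "allowCreateArtifact": True,     # create items missing in the target stage
        "allowOverwriteArtifact": True,  # overwrite their paired counterparts
    },
    # per the API reference, "isBackwardDeployment": True deploys toward an
    # earlier stage instead (only possible when the target stage is empty)
}
r = requests.post(f"{API}/pipelines/{pipeline_id}/deployAll", json=body,
                  headers=headers)
r.raise_for_status()  # the call is asynchronous; a 202 returns an operation id

# Selective deployment: name only the items to move
body = {
    "sourceStageOrder": 0,
    "datasets": [{"sourceId": "<semantic model GUID>"}],  # placeholder
    "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
}
r = requests.post(f"{API}/pipelines/{pipeline_id}/deploy", json=body,
                  headers=headers)
r.raise_for_status()
```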
- {action|optional} create deployment rules
- when deploying content between pipeline stages, allows changing certain settings, such as data source connections or query parameters, so that the same content functions correctly in each stage [5]
- once a rule is defined or changed, the content must be redeployed
- the deployed content inherits the value defined in the deployment rule [5]
- the value always applies as long as the rule is unchanged and valid [5]
- {feature} deployment history
- allows seeing the last time content was deployed to each stage [5]
- allows tracking the time between deployments (see the sketch below) [5]
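The same history is also queryable; a sketch against the pipeline operations endpoint (the response field names are as recalled from the API reference and worth verifying):

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"
token = "<AAD access token>"     # placeholder
pipeline_id = "<pipeline GUID>"  # placeholder
headers = {"Authorization": f"Bearer {token}"}

# List past deployment operations for the pipeline
r = requests.get(f"{API}/pipelines/{pipeline_id}/operations", headers=headers)
r.raise_for_status()
for op in r.json().get("value", []):
    # assumed fields: status plus execution start/end times per operation
    print(op.get("status"), op.get("executionStartTime"), op.get("executionEndTime"))
```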
- {concept} pairing
- {def} the process by which an item in one stage of the deployment pipeline is associated with the same item in the adjacent stage
- applies to reports, dashboards, and semantic models
- paired items appear on the same line in the pipeline content list [5]
- ⇐ items that aren't paired appear on a line by themselves [5]
- the items remain paired even if their name changes
- items added after the workspace is assigned to a pipeline aren't automatically paired [5]
- ⇐ one can have identical items in adjacent workspaces that aren't paired (see the sketch below) [5]
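Pairing can be inspected programmatically; a sketch that lists a stage's items and flags unpaired ones, assuming the stage-artifacts response groups items by type and exposes a sourceArtifactId on paired items (verify both against the current reference):

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"
token = "<AAD access token>"     # placeholder
pipeline_id = "<pipeline GUID>"  # placeholder
headers = {"Authorization": f"Bearer {token}"}

# Inspect the Test stage (stageOrder 1)
r = requests.get(f"{API}/pipelines/{pipeline_id}/stages/1/artifacts", headers=headers)
r.raise_for_status()
stage = r.json()
for group in ("datasets", "reports", "dashboards", "dataflows"):
    for item in stage.get(group, []):
        # assumption: a paired item carries the id of its twin in the source stage
        paired = "paired" if item.get("sourceArtifactId") else "UNPAIRED"
        print(group, item.get("artifactDisplayName"), paired)
```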
- [lakehouse]
- can be removed as a dependent object upon deployment [3]
- supports mapping different Lakehouses within the deployment pipeline context [3]
- {default} a new empty Lakehouse object with the same name is created in the target workspace [3]
- ⇐ if nothing is specified during deployment pipeline configuration
- notebook and Spark job definitions are remapped to reference the new lakehouse object in the new workspace [3]
- {warning} a new empty Lakehouse object with the same name is still created in the target workspace [3]
- SQL Analytics endpoints and semantic models are provisioned
- no object inside the Lakehouse is overwritten [3]
- updates to Lakehouse name can be synchronized across workspaces in a deployment pipeline context [3]
- [notebook] deployment rules can be used to customize the behavior of notebooks when deployed [4]
- e.g. change notebook's default lakehouse [4]
- {feature} auto-binding
- binds the default lakehouse and attached environment within the same workspace when deploying to the next stage [4]
- [environment] custom pools are not supported in deployment pipelines
- the configurations of the Compute section in the destination environment are set to default values [6]
- ⇐ subject to change in upcoming releases [6]
- [warehouse]
- [database project] ALTER TABLE to add a constraint or column
- {limitation} the table will be dropped and recreated when deploying, resulting in data loss
- {recommendation} do not create a Dataflow Gen2 with an output destination to the warehouse
- ⇐ deployment would be blocked by a new item named DataflowsStagingWarehouse that appears in the deployment pipeline [10]
- SQL analytics endpoint is not supported
- [Eventhouse]
- {limitation} connections that use Direct Ingestion mode must be reconfigured in the destination [8]
- [EventStream]
- {limitation} limited support for cross-workspace scenarios
- {recommendation} make sure all EventStream destinations are within the same workspace [8]
- [KQL database]
- applies to tables, functions, and materialized views [7]
- [KQL queryset]
- applies to tabs and data sources [7]
- [real-time dashboard]
- applies to data sources, parameters, base queries, and tiles [7]
- [SQL database]
- the deployment includes the specific differences between the individual database objects in the development and test workspaces [9]
- can also be used with
- Copy Job
- Dataflow Gen2
- data pipeline
- mirrored database
- Activator
- Power BI items
References:
[1] Microsoft Learn (2024) Get started with deployment pipelines [link]
[2] Microsoft Learn (2024) Implement continuous integration and continuous delivery (CI/CD) in Microsoft Fabric [link]
[3] Microsoft Learn (2024) Lakehouse deployment pipelines and git integration (Preview) [link]
[4] Microsoft Learn (2024) Notebook source control and deployment [link]
[5] Microsoft Learn (2024) Introduction to deployment pipelines [link]
[6] Microsoft Learn (2024) Environment Git integration and deployment pipeline [link]
[7] Microsoft Learn (2024) Real-Time Intelligence: Git integration and deployment pipelines (Preview) [link]
[8] Microsoft Learn (2024) Eventstream CI/CD - Git Integration and Deployment Pipeline [link]
[9] Microsoft Learn (2024) Get started with deployment pipelines integration with SQL database in Microsoft Fabric [link]
[10] Microsoft Learn (2025) Source control with Warehouse (preview) [link]
Acronyms:
CLM - Content Lifecycle Management
UAT - User Acceptance Testing