Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)!
Last updated: 26-Apr-2025
         
[Microsoft Fabric] Power BI Environments
 - {def} structured spaces within Microsoft Fabric that help organizations manage Power BI assets through their entire lifecycle
 - {environment} development
 - where the solution is developed
 - accessible only to the development team
 - via Contributor access
 - {recommendation} use Power BI Desktop as a local development environment
 - {benefit} allows trying, exploring, and reviewing updates to reports and datasets
 - once the work is done, upload the new version to the development stage
 - {benefit} enables collaborating on and changing dashboards
 - {benefit} avoids duplication
 - making online changes, downloading the .pbix file, and then uploading it again creates duplicate reports and datasets
 - {recommendation} use version control to keep the .pbix files up to date (see the sketch after this list)
 - [OneDrive] use Power BI's autosync
 - {alternative} SharePoint Online with folder synchronization
 - {alternative} GitHub and/or VSTS with local repository & folder synchronization
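 - {example} a minimal sketch of the folder-synchronization approach, assuming a local working folder and a OneDrive-synced folder (both paths are hypothetical); Power BI's OneDrive autosync then picks up the new versions
```python
# Sketch: copy the latest .pbix files from a local working folder into a
# OneDrive-synced folder so Power BI's autosync can pick them up.
# Folder paths are assumptions, not prescribed values.
from pathlib import Path
import shutil

WORK_DIR = Path(r"C:\Dev\PowerBI")                     # local development folder (assumed)
ONEDRIVE_DIR = Path(r"C:\Users\me\OneDrive\PowerBI")   # synced folder (assumed)

for pbix in WORK_DIR.glob("*.pbix"):
    target = ONEDRIVE_DIR / pbix.name
    # copy only when the local file is newer than the synced copy
    if not target.exists() or pbix.stat().st_mtime > target.stat().st_mtime:
        shutil.copy2(pbix, target)
        print(f"Synced {pbix.name}")
```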
 - [enterprise scale deployments]
 - {recommendation} separate dataset development from report and dashboard development
 - use the deployment pipelines selective deploy option [22] (see the sketch after this list)
 - create separate .pbix files for datasets and reports [22]
 - create a dataset .pbix file and upload it to the development stage (see shared datasets) [22]
 - create .pbix only for the report, and connect it to the published dataset using a live connection [22]
 - {benefit} allows different creators to separately work on modeling and visualizations, and deploy them to production independently
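 - {example} a hedged sketch of the selective-deploy call against the deployment pipelines REST API; the pipeline ID, dataset ID, and access token (e.g. acquired via MSAL) are placeholders, and the request shape should be verified against the Pipelines - Selective Deploy documentation
```python
# Sketch: deploy only a dataset from the development stage to the next stage
# via the deployment pipelines REST API. IDs and token are placeholders.
import requests

ACCESS_TOKEN = "<token acquired via MSAL or the Azure CLI>"   # placeholder
PIPELINE_ID = "<deployment-pipeline-id>"                      # placeholder
DATASET_ID = "<dataset-id-in-source-stage>"                   # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deploy"
body = {
    "sourceStageOrder": 0,                   # 0 = development stage
    "datasets": [{"sourceId": DATASET_ID}],  # deploy only this dataset
    "options": {
        "allowCreateArtifact": True,         # create the item in the target stage if missing
        "allowOverwriteArtifact": True,      # overwrite it if it already exists
    },
}
resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
print("Deployment accepted:", resp.status_code)   # long-running operation, returns 202
```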
 - {recommendation} separate data model from report and dashboard development
 - allows using advanced capabilities
 - e.g. source control, merging diff changes, automated processes
 - separate the development data sources from the test data sources [1]
 - the development database should be relatively small [1]
 - {recommendation} use only a subset of the data [1]
 - ⇐ otherwise the data volume can slow down the development [1]
 - {environment} user acceptance testing (UAT)
 - test environment that sits between development and production within the deployment lifecycle
 - it's not necessary for all Power BI solutions [3]
 - allows testing the solution before deploying it to production
 - all testers must have
 - View access for testing
 - Contributor access for report authoring
 - involves business users who are SMEs
 - provide approval that the content
 - is accurate
 - meets requirements
 - can be deployed for wider consumption
 - {recommendation} check the report's load time and interactions to find out whether the changes impact performance [1]
 - {recommendation} monitor the load on the capacity to catch extreme loads before they reach production [1]
 - {recommendation} test data refresh in the Power BI service regularly during development [20]
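 - {example} a small sketch for exercising data refresh from the service during development, using the datasets REST API; workspace/dataset IDs and the token are placeholders
```python
# Sketch: trigger a dataset refresh in the Power BI service and read back the
# latest refresh-history entry to verify it completed. IDs and token are placeholders.
import time
import requests

ACCESS_TOKEN = "<token>"        # placeholder
GROUP_ID = "<workspace-id>"     # placeholder
DATASET_ID = "<dataset-id>"     # placeholder
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# kick off a refresh (the service answers 202 Accepted)
requests.post(f"{BASE}/refreshes", headers=HEADERS).raise_for_status()

# poll the refresh history until the latest run leaves the in-progress ('Unknown') state
while True:
    history = requests.get(f"{BASE}/refreshes?$top=1", headers=HEADERS).json()
    status = history["value"][0]["status"]
    if status != "Unknown":
        print("Refresh finished with status:", status)   # Completed / Failed / Disabled
        break
    time.sleep(30)
```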
 - {environment} production
 - {concept} staged deployment
 - {goal} help minimize risk, user disruption, or address other concerns [3]
 - the deployment involves a smaller group of pilot users who provide feedback [3]
 - {recommendation} set production deployment rules for data sources and parameters defined in the dataset [1]
 - helps ensure that the data in production is always connected and available to users [1]
 - {recommendation} don’t upload a new .pbix version directly to the production stage
 - ⇐ without going through testing
 - {feature|preview} deployment pipelines
 - enable creators to develop and test content in the service before it reaches the users [5]
 - {recommendation} build separate databases for development and testing
 - helps protect production data [1]
 - {recommendation} make sure that the test and production environment have similar characteristics [1]
 - e.g. data volume, usage volume, similar capacity
 - {warning} testing in production can make production unstable [1]
 - {recommendation} use Azure A capacities [22]
 - {recommendation} for formal projects, consider creating an environment for each phase
 - {recommendation} enable users to connect to published datasets to create their own reports
 - {recommendation} use parameters to store connection details (see the sketch after this list)
 - e.g. instance names, database names
 - ⇐ deployment pipelines allow configuring parameter rules to set specific values for the development, test, and production stages
 - alternatively data source rules can be used to specify a connection string for a given dataset
 - {restriction} in deployment pipelines, this isn't supported for all data sources
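 - {example} for parameter-driven data sources, a hedged sketch of updating parameter values through the datasets REST API (Update Parameters In Group); the parameter names are assumptions and the call isn't supported for every dataset or parameter type
```python
# Sketch: point a published dataset at the production server/database by updating
# its parameters. Parameter names, IDs, and the token are placeholders; a refresh
# is normally needed afterwards for the new values to take effect.
import requests

ACCESS_TOKEN = "<token>"        # placeholder
GROUP_ID = "<workspace-id>"     # placeholder
DATASET_ID = "<dataset-id>"     # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/Default.UpdateParameters")
body = {
    "updateDetails": [
        {"name": "ServerName", "newValue": "prod-sql.contoso.com"},   # assumed parameter
        {"name": "DatabaseName", "newValue": "SalesDW"},              # assumed parameter
    ]
}
resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
print("Parameters updated:", resp.status_code)
```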
 - {recommendation} keep the data in blob storage under 50K blobs and 5 GB of data in total to prevent timeouts [29]
 - {recommendation} provide data to self-service authors from a centralized data warehouse [20]
 - minimizes the amount of work that self-service authors need to take on [20]
 - {recommendation} minimize the use of Excel, csv, and text files as sources when practical [20]
 - {recommendation} store source files in a central location accessible by all coauthors of the Power BI solution [20]
 - {recommendation} be aware of API connectivity issues and limits [20]
 - {recommendation} know how to support SaaS solutions from AppSource and expect further data integration requests [20]
 - {recommendation} minimize the query load on source systems [20] (see the sketch after this list)
 - use incremental refresh in Power BI for the dataset(s)
 - use a Power BI dataflow that extracts the data from the source on a schedule
 - reduce the dataset size by only extracting the needed amount of data
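 - {example} a hedged sketch of the enhanced refresh request body (same refreshes endpoint), which can limit a refresh to specific tables or partitions so that only the needed slice of the source is queried; IDs, token, and the table name are assumptions
```python
# Sketch: use the enhanced refresh request body to refresh only selected tables,
# limiting how much of the source system is queried per run.
import requests

ACCESS_TOKEN = "<token>"        # placeholder
GROUP_ID = "<workspace-id>"     # placeholder
DATASET_ID = "<dataset-id>"     # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
body = {
    "type": "full",
    "commitMode": "transactional",
    "objects": [
        {"table": "Sales"}      # refresh only this table (assumed name)
    ],
}
resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
print("Enhanced refresh accepted:", resp.status_code)
```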
 - {recommendation} expect data refresh operations to take some time [20]
 - {recommendation} use relational database sources when practical [20]
 - {recommendation} make the data easily accessible [20]
 - [knowledge area] knowledge transfer
 - {recommendation} maintain a list of best practices and review it regularly [24]
 - {recommendation} develop a training plan for the various types of users [24]
 - usability training for read-only report/app users [24]
 - self-service reporting for report authors & data analysts [24]
 - more elaborate training for advanced analysts & developers [24]
 - [knowledge area] lifecycle management
 - consists of the processes and practices used to handle content from its creation to its eventual retirement [6]
 - {recommendation} suffix files with a three-part version number in the development stage [24] (see the sketch after this list)
 - remove the version number when publishing files to UAT and production
 - {recommendation} backup files for archive
 - {recommendation} track version history
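 - {example} a minimal sketch, assuming a flat development folder, for stamping .pbix files with a three-part version number and keeping prior versions in an archive folder; folder names and the version value are assumptions
```python
# Sketch: archive the current .pbix files under a three-part version suffix
# (e.g. SalesReport_1.2.3.pbix) so older versions stay available for rollback.
# Folder layout and version source are assumptions.
from pathlib import Path
import shutil

DEV_DIR = Path(r"C:\Dev\PowerBI")       # development folder (assumed)
ARCHIVE_DIR = DEV_DIR / "archive"       # archive folder (assumed)
VERSION = "1.2.3"                       # three-part version, maintained manually (assumed)

ARCHIVE_DIR.mkdir(exist_ok=True)
for pbix in DEV_DIR.glob("*.pbix"):
    versioned = ARCHIVE_DIR / f"{pbix.stem}_{VERSION}{pbix.suffix}"
    shutil.copy2(pbix, versioned)       # keep the versioned copy for history
    print(f"Archived {pbix.name} as {versioned.name}")
```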
 
[3] Microsoft Learn (2024) Deploy to Power BI [link]
[4] Microsoft Learn (2024) Power BI implementation planning: Content lifecycle management [link]
[5] Microsoft Learn (2024) Introduction to deployment pipelines [link]
API - Application Programming Interface
CLM - Content Lifecycle Management
COE - Center of Excellence
SaaS - Software-as-a-Service
SME - Subject Matter Expert
UAT - User Acceptance Testing
VSTS - Visual Studio Team Services



