
28 March 2025

🏭🗒️Microsoft Fabric: OneLake Role-Based Access Control (RBAC) [Notes]

Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

Last updated: 28-Mar-2025

[Microsoft Fabric] OneLake Role-based access control (RBAC)

  • {def} security framework that allows to manage access to resources by assigning roles to users or groups 
    • applies to Lakehouse Items only [1]
    • restricts data access for users with Workspace Viewer or read access to a lakehouse [1]
    • doesn't apply to Workspace Admins, Members, or Contributors [1]
      • ⇒ supports only Read level of permissions [1]
    • uses role assignments to apply permissions to its members
      • assigned to 
        • individuals
        • security groups
        • Microsoft 365 groups
        • distribution lists
        • ⇐ every member of the user group gets the assigned role [1]
      • users in multiple groups get the highest level of permission that is provided by the roles [1]
    • managed through the lakehouse data access settings [1]
    • when a lakehouse is created, OneLake generates a default RBAC Role named Default Readers [1]
      • allows all users with ReadAll permission to read all folders in the Item [1]
    • permissions always inherit to the entire hierarchy of the folder's files and subfolders [1]
    • provides automatic traversal of parent items to ensure that data is easy to discover [1]
      • ⇐ similar to Windows folder permissions [1]
      • [shortcuts] shortcuts to other OneLake locations have specialized behavior [1]
        • the access to a OneLake shortcut is determined by the target permissions of the shortcut [1]
          • when listing shortcuts, no call is made to check the target access [1]
            • ⇒ when listing a directory all internal shortcuts will be returned regardless of a user's access to the target [1]
              • when a user tries to open the shortcut the access check will evaluate and a user will only see data they have the required permissions to see [1]
    • enables restricting data access in OneLake to specific folders only [1]
  • {action} share a lakehouse
    • grants other users or a group of users access to a lakehouse without giving access to the workspace and the rest of its items [1]
    • found through 
      • Data Hub 
      • 'Shared with Me' section in Microsoft Fabric
  • [shortcuts] permissions always inherit to all Internal shortcuts where a folder is defined as target [1]
    • when a user accesses data through a shortcut to another OneLake location, the identity of the calling user is used to authorize access to the data in the target path of the shortcut [1]
      • ⇒ the user must have OneLake RBAC permissions in the target location to read the data [1]
      • defining RBAC permissions for the internal shortcut is not allowed [1]
        • must be defined on the target folder located in the target item [1]
        • OneLake enables RBAC permissions only for shortcuts targeting folders in lakehouse items [1]

Previous Post <<||>> Next Post

References:
[1] Microsoft Learn (2024) Fabric: Role-based access control (RBAC) [link]
[2] Microsoft Learn (2024) Best practices for OneLake security [link]

Resources:
[R1] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]

Acronyms:
ADLS - Azure Data Lake Storage
RBAC - Role-Based Access Control

25 March 2025

🏭🗒️Microsoft Fabric: Security in Warehouse [Notes]

Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

Last updated: 25-Mar-2025

[Microsoft Fabric] Security in Warehouse
  • {def} suite of technologies aimed at safeguarding sensitive information in Fabric [1]
    • leverages SQL engine’s security features [1]
      • allows for security mechanism at the warehouse level [1]
      • ⇐ the warehouse and SQL analytics endpoint items also allow for the defining of native SQL security [4]
        • the permissions configured only apply to the queries executed against the respective surfaces [4]
      • the access to OneLake data is controlled separately through OneLake data access roles [4]
        • {recommendation} to ensure users with SQL specific permissions don't see data they don't have SQL access to, don't include those users in a OneLake data access role [4]
    • supports a range of data protection features that enable administrators to shield sensitive data from unauthorized access [1]
      • ⇐ across warehouses and SQL analytics endpoints without necessitating changes to applications [1]
    • {type} object-level security (OLS)
      • permissions governing DML operations [1]
        • applies to tables and views
        • ⇐ when denied, the user will be prevented from performing the respective operation
        • SELECT
          • allows users to view the data within the object [1]
        • INSERT
          • allows users to insert data in the object [1]
        • UPDATE
          • allows users to update data within the object [1]
        • DELETE
          • allows users to delete the data within the object [1]
      • permissions can be granted, revoked or denied on database objects [1]
        •  tables and views
        • GRANT
          • permission is granted to a user or role [1]
        • DENY
          • permission is denied to a user or role [1]
        • REVOKE
          • permission is revoked from a user or role [1]
        • ALTER
          • grants the user the ability to change the definition of the object [1]
        • CONTROL
          • grants the user all rights to the object [1]
      • {principle} least privilege
        • users and applications should be given only the permissions needed to complete their tasks (see the T-SQL sketch below)
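
A minimal T-SQL sketch of the object-level permissions above (table, role and user names are hypothetical):

-- create a database role and add a member to it
CREATE ROLE SalesReader;
ALTER ROLE SalesReader ADD MEMBER [MaryJones@contoso.com];

-- grant read access; deny destructive operations
GRANT SELECT ON dbo.SalesOrder TO SalesReader;
DENY UPDATE, DELETE ON dbo.SalesOrder TO SalesReader;

-- withdraw a previously granted or denied permission
REVOKE UPDATE ON dbo.SalesOrder FROM SalesReader;
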
    • {type} column-level security (CLS)
      • allows to restrict column access to sensitive data [1]
        • provides granular control over who can access specific pieces of data [1]
          •  enhances the overall security of the data warehouse [1]
      • steps
        • identify the sensitive columns [1]
        • define access roles [1]
        • assign roles to users [1]
        • implement access control [1]
          • restrict access to a column based on the user's role [1] (see the sketch below)
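
A minimal sketch of column-level security via column-scoped permissions (table, column and role names are hypothetical):

-- expose only the non-sensitive columns
GRANT SELECT ON dbo.Employee (EmployeeId, FirstName, LastName, Department) TO HRReader;

-- explicitly block the sensitive column
DENY SELECT ON dbo.Employee (Salary) TO HRReader;
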
    • {type} row-level security (RLS)
      • provides granular control over access to rows in a table based on group membership or execution context [1]
        • using WHERE clause filters [1]
      • works by associating a function (aka security predicate) with a table [1]
        • defined to return true or false based on certain conditions [1]
          • ⇐ typically involving the values of one or more columns in the table [1]
          • when a user attempts to access data in the table, the security predicate function is invoked [1]
            • if the function returns true, the row is accessible to the user; otherwise, the row doesn't show up in the query results [1]
        • the predicate can be as simple/complex as required [1]
        • the process is transparent to the user and is enforced automatically by SQL Server
          • ⇐ ensures consistent application of security rules [1]
      • implemented in two main steps:
        • filter predicates 
          • an inline table-valued function that filters the results based on the predicate defined [1]
        • security policy
          • invokes an inline table-valued function to protect access to the rows in a table [1]
            • because access control is configured and applied at the warehouse level, application changes are minimal - if any [1]
            • users can directly have access to the tables and can query their own data [1]
      • {recommendation} create a separate schema for predicate functions, and security policies [1]
      • {recommendation} avoid type conversions in predicate functions [1]
      • {recommendation} to maximize performance, avoid using excessive table joins and recursion in predicate functions [1]
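
A minimal sketch of the two RLS steps above, assuming a hypothetical dbo.SalesOrder table with a SalesRep column holding the sales representative's user name:

-- keep predicate functions and security policies in a dedicated schema, as recommended above
CREATE SCHEMA Security;
GO

-- filter predicate: inline table-valued function returning a row only when the user may see the record
CREATE FUNCTION Security.fn_SalesRepFilter(@SalesRep AS nvarchar(128))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @SalesRep = USER_NAME() OR USER_NAME() = 'SalesManager';
GO

-- security policy: binds the predicate to the table; non-matching rows simply don't show up in query results
CREATE SECURITY POLICY Security.SalesOrderFilter
ADD FILTER PREDICATE Security.fn_SalesRepFilter(SalesRep) ON dbo.SalesOrder
WITH (STATE = ON);
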
    • {type} dynamic data masking (DDM) 
      • allows limiting data exposure to nonprivileged users by obscuring sensitive data [1]
        • e.g. email addresses 
      • {benefit} enhance the security and manageability of the data [1]
      • {feature} real-time masking
        • when querying sensitive data, DDM applies dynamic masking to it in real time [1]
          • the actual data is never exposed to unauthorized users, thus enhancing the security of your data [1]
        • straightforward to implement [1]
        • doesn’t require complex coding, making it accessible for users of all skill levels [1]
        • {benefit} the data in the database isn’t changed when DDM is applied
          •   the actual data remains intact and secure, while nonprivileged users only see a masked version of the data [1]
      • {operation} define masking rule
        • set up at column level [1]
        • offers a suite of features [1]
          • comprehensive and partial masking capabilities [1]
          • supports several masking types
            • help prevent unauthorized viewing of sensitive data [1]
              • by enabling administrators to specify how much sensitive data to reveal [1]
                •   minimal effect on the application layer [1]
            • applied to query results, so the data in the database isn't changed 
              •   allows many applications to mask sensitive data without modifying existing queries  [1]
          • random masking function designed for numeric data [1]
        • {risk} unprivileged users with query permissions can infer the actual data since the data isn’t physically obfuscated [1]
      • {recommendation} DDM should be used as part of a comprehensive data security strategy [1]
        • should include
          • the proper management of object-level security with SQL granular permissions [1]
          • adherence to the principle of minimal required permissions [1]
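
A minimal sketch of masking rules defined at column level (table, column and role names are hypothetical):

-- full mask for email addresses
ALTER TABLE dbo.Customer
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- partial mask: show the first two characters, pad the rest
ALTER TABLE dbo.Customer
ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(2, "XXXXXXX", 0)');

-- privileged users can be allowed to see the unmasked values
GRANT UNMASK TO DataSteward;
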
    • {concept} Dynamic SQL 
      • allows T-SQL statements to be generated within a stored procedure or a query itself [1]
        • executed via sp_executesql stored procedure
      • {risk} SQL injection attacks
        • use  QUOTENAME to sanitize inputs [1]
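
A minimal sketch of dynamic SQL with parameterized values and QUOTENAME-sanitized identifiers (procedure and table names are hypothetical):

CREATE PROCEDURE dbo.SearchCustomers
    @TableName sysname
,   @City nvarchar(50)
AS
BEGIN
    -- identifiers are quoted, values are passed as parameters (never concatenated)
    DECLARE @sql nvarchar(max) =
        N'SELECT * FROM dbo.' + QUOTENAME(@TableName) + N' WHERE City = @City;';

    EXECUTE sp_executesql @sql, N'@City nvarchar(50)', @City = @City;
END;
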
  • write access to a warehouse or SQL analytics endpoint
    • {approach} granted through the Fabric workspace roles
      • the role automatically translates to a corresponding role in SQL that grants equivalent write access [4]
      • {recommendation} if a user needs write access to all warehouses and endpoints, assign the user to a workspace role [4]
        • use the Contributor role unless the user needs to assign other users to workspace roles [4]
      • {recommendation} grant direct access through SQL permissions if the user only needs to write to specific warehouses or endpoints [4]
    • {approach} grant read access to the SQL engine, and grant custom SQL permissions to write to some or all the data [4]
  • read access to a warehouse or SQL analytics endpoint
    • {approach} grant read access through the ReadData permission, granted as part of the Fabric workspace roles [4]
      •  ReadData permission maps the user to a SQL role that gives SELECT permissions on all tables in the warehouse or lakehouse
        • helpful if the user needs to see all or most of the data in the lakehouse or warehouse [4]
        • any SQL DENY permissions set on a particular lakehouse or warehouse still apply and limit access to tables [4]
        • row and column level security can be set on tables to restrict access at a granular level [4]
    • {approach} grant read access to the SQL engine, and grant custom SQL permissions to read some or all of the data [4]
    • if the user needs access only to a specific lakehouse or warehouse, the share feature provides access to only the shared item [4]
      • during the share, users can choose to give only Read permission or Read + ReadData 
        • granting Read permission allows the user to connect to the warehouse or SQL analytics endpoint but gives no table access [4]
        • granting users the ReadData permissions gives them full read access to all tables in the warehouse or SQL analytics endpoint
      • ⇐ additional SQL security can be configured to grant or deny access to specific tables [4]

    References:
    [1] Microsoft Learn (2024) Secure a Microsoft Fabric data warehouse [link]
    [2] Data Mozart (2025) Lock Up! Understanding Data Access Options in Microsoft Fabric, by Nikola Ilic [link]
    [3] Microsoft Learn (2024) Security in Microsoft Fabric [link]
    [4] Microsoft Learn (2024) Microsoft Fabric: How to secure a lakehouse for Data Warehousing teams [link]

    Resources:
    [R1] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]
    [R2] Microsoft Learn (2025) Fabric: Security for data warehousing in Microsoft Fabric [link]
    [R3] Microsoft Learn (2025) Fabric: Share your data and manage permissions [link]

    Acronyms:
    CLS - Column-Level Security
    DDM - Dynamic Data Masking
    DML - Data Manipulation Language 
    MF - Microsoft Fabric
    OLS - Object-Level Security
    RLS - Row-Level Security
    SQL - Structured Query Language

    🏭🗒️Microsoft Fabric: Security [Notes]

    Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

    Last updated: 25-Mar-2025

    [Microsoft Fabric] Security
    • {def} a comprehensive security framework designed for the Microsoft Fabric platform [1]
      • {goal} always on 
        • every interaction with Fabric is encrypted by default and authenticated using Microsoft Entra ID [1]
          • all communication between Fabric experiences travels through the Microsoft backbone internet [1]
          • data at rest is automatically stored encrypted [1]
          • support for extra security features [1]
            • ⇐ allow to regulate access to Fabric [1]
            • Private Links 
              • enable secure connectivity to Fabric by 
                • restricting access to the Fabric tenant to traffic from an Azure virtual network (VNet)
                • blocking all public access
              • ensures that only network traffic from that VNet is allowed to access Fabric features [1]
            • Entra Conditional Access 
          • the connection to data is protected by a firewall or a private network using trusted access [1]
            • access firewall enabled ADL Gen2 accounts securely [1]
              • can be limited to specific workspaces [1]
                • workspaces that have a workspace identity can securely access ADL Gen 2 accounts with public network access enabled, from selected virtual networks and IP addresses [1]
              • workspace identities can only be created in workspaces associated with a Fabric F SKU capacity [1]
          • helps users connect to services quickly and easily from any device and any network [1]
            • each request to connect to Fabric is authenticated with Microsoft Entra ID [1]
              • allows users to safely connect to Fabric from their corporate office, when working at home, or from a remote location [1]
          • {feature} Conditional Access
            • allows to secure access to Fabric on every connection by
              • defining a list of IPs for inbound connectivity to Fabric [1]
              • using MFA [1]
              • restricting traffic based on parameters such as country of origin or device type [1]
      • {goal} compliant
        • data sovereignty provided out-of-box with multi geo capacities [1]
        • support for a wide range of compliance standards [1]
        • Fabric services follow the Security Development Lifecycle (SDL)
          • a set of strict security practices that support security assurance and compliance requirements [2]
          • helps developers build more secure software by reducing the number and severity of vulnerabilities in software, while reducing development cost [2]
      • {goal} governable
        • leverages a set of governance tools
          • data lineage
          • information protection labels
          • data loss prevention 
          • Purview integration 
      • configurable
        •  in accordance with organizational policies [1]
      • evolving 
        • new features and controls are added regularly [1]
    • {feature} managed private endpoints 
      • allow secure connections to data sources without exposing them to the public network or requiring complex network configurations [1]
        • e.g. as Azure SQL databases
    • {feature} managed virtual networks
      • virtual networks that are created and managed by Microsoft Fabric for each Fabric workspace [1]
      • provide network isolation for Fabric Spark workloads
        • the compute clusters are deployed in a dedicated network and are no longer part of the shared virtual network [1]
      • enable network security features
        • managed private endpoints
        • private link support
    • {feature} data gateway
      • allows to connect to on-premises data sources or a data source that might be protected by a firewall or a virtual network
      • {option} On-premises data gateway
        • acts as a bridge between on-premises data sources and Fabric [1]
        • installed on a server within the network [1]
        • allows Fabric to connect to data sources through a secure channel without the need to open ports or make changes to the network [1]
      • {option} Virtual network (VNet) data gateway
        • allows to connect from Microsoft Cloud services to Azure data services within a VNet, without the need of an on-premises data gateway [1]
    • {feature} Azure service tags
      • allows to ingest data from data sources deployed in an Azure virtual network without the use of data gateways [1]
        • e.g. VMs, Azure SQL MI and REST APIs
      • can be used to get traffic from a virtual network or an Azure firewall
        • e.g. outbound traffic to Fabric so that a user on a VM can connect to Fabric SQL connection strings from SSMS, while blocked from accessing other public internet resources [1]
    • {feature} IP allow-lists
      • allows to enable an IP allow-list on organization's network to allow traffic to and from Fabric
      • useful for data sources that don't support service tags [1]
        • e.g. on-premises data sources
    • {feature} Telemetry
      • used to maintain performance and reliability of the Fabric platform [2]
      • the telemetry store is designed to be compliant with data and privacy regulations for customers in all regions where Fabric is available [2]
    • {process} authentication
      • relies on Microsoft Entra ID to authenticate users (or service principals) [2]
      • when authenticated, users receive access tokens from Microsoft Entra ID [2]
        • used to perform operations in the context of the user [2]
      • {feature} conditional access
        • ensures that tenants are secure by enforcing multifactor authentication [2]
          • allows only Microsoft Intune enrolled devices to access specific services [1] 
        • restricts user locations and IP ranges.
    • {process} authorization
      • all Fabric permissions are stored centrally by the metadata platform
        • Fabric services query the metadata platform on demand to retrieve authorization information and to authorize and validate user requests [2]
      • authorization information is sometimes encapsulated into signed tokens [2]
        • only issued by the back-end capacity platform [1]
        • include the access token, authorization information, and other metadata [1]
    • {concept} tenant metadata 
      • information about the tenant 
      • is stored in a metadata platform cluster to which the tenant is assigned
        • located in a single region that meets the data residency requirements of that region's geography [2]
        • include customer data 
        • customers can control where their workspaces are located
          • in the same geography as the metadata platform cluster
            • by explicitly assigning workspaces on capacities in that region [2]
            • by implicitly using Fabric Trial, Power BI Pro, or Power BI Premium Per User license mode [2]
              • all customer data is stored and processed in this single geography [2]
          • in Multi-Geo capacities located in geographies (geos) other than their home region [2]
            • compute and storage is located in the multi-geo region [2]
              • (including OneLake and experience-specific storage) [2]
            • {exception} the tenant metadata remains in the home region
            • customer data will only be stored and processed in these two geographies [2]
    • {concept} data-at-rest
      • all Fabric data stores are encrypted at rest [2]
        • by using Microsoft-managed keys
        • includes customer data as well as system data and metadata [2]
        •  data is never persisted to permanent storage while in an unencrypted state [1]
          • data can be processed in memory in an unencrypted state [2]
      • {default} encrypted using platform managed keys (PMK)
        • Microsoft is responsible for all aspects of key management [2]
        • data-at-rest on OneLake is encrypted using its keys [3]
        • {alternative} Customer-managed keys (CMK) 
          • allow to encrypt data at-rest using customer keys [3]
            •   customer assumes full control of the key [3]
          • {recommendation} use cloud storage services with CMK encryption enabled and access data from Fabric using OneLake shortcuts [3]
            • data continues to reside on a cloud storage service or an external storage solution where encryption at rest using CMK is enabled [3]
            • customers can perform in-place read operations from Fabric whilst staying compliant [3] 
            • shortcuts can be accessed by other Fabric experiences [3]
    • {concept} data-in-transit
      • refers to traffic between Microsoft services routed over the Microsoft global network [2]
      • inbound communication
        • always encrypted with at least TLS 1.2. Fabric negotiates to TLS 1.3 whenever possible [2]
        • inbound protection
          •  concerned with how users sign in and have access to Fabric [3]
      • outbound communication to customer-owned infrastructure 
        • adheres to secure protocols [2]
          • {exception} might fall back to older, insecure protocols when newer protocols aren't supported [2]
            • incl. TLS 1.0
        • outbound protection
          • concerned with securely accessing data behind firewalls or private endpoints [3]


    References:
    [1] Microsoft Learn (2024) Security in Microsoft Fabric [link]
    [2] Microsoft Learn (2024) Microsoft Fabric security fundamentals [link]
    [3] Microsoft Learn (2024) Microsoft Fabric end-to-end security scenario [link]

    Resources:
    [R1] Microsoft Learn (2024) Microsoft Fabric security [link]
    [R2] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]

    Acronyms:
    ADL - Azure Data Lake
    API - Application Programming Interface
    CMK - Customer-Managed Keys
    MF - Microsoft Fabric
    MFA - Multifactor Authentication 
    MI - Managed Instance 
    PMK - Platform-Managed Keys
    REST - REpresentational State Transfer
    SDL - Security Development Lifecycle
    SKU - Stock Keeping Unit
    TLS  - Transport Layer Security
    VM - Virtual Machine
    VNet - virtual network
    VPN - Virtual Private Network

    15 March 2025

    💠🛠️🗒️SQL Server: Schemas [Notes]

    Disclaimer: This is work in progress based on notes gathered over the years, intended to consolidate information from the various sources. 

    Last updated: 15-Mar-2025

    [SQL Server 2005] Schemas

    • {def} a collection of database objects that are owned by a single user and form a single namespace
      • a named container for database objects
        • allows to group objects into separate namespaces
        • collection of like objects which provide maintenance and security to those objects as a whole, without affecting objects within other schemas [1]
      • reside within databases
      • fulfilling a common purpose [1]
      • each schema can contain zero or more data structures (aka objects) [1]
      • all objects within a schema share 
        • a common naming context
        • a common security context [10]
      • behavior of schema changed 
        • ⇐ compared to SQL Server 2000
        • schemas are no longer equivalent to database users
          • each schema is a distinct namespace that exists independently of the database user who created it
            • used as a prefix to the object name
            • schema is simply a container of objects [3]
          • code written for earlier releases of SQL Server may return incorrect results, if the code assumes that schemas are equivalent to database users [3]
      • can be owned by any database principal
        • this includes roles and application roles  [3]
        • its ownership is transferable [3]
        • every object is contained by a schema [6]
        • anything contained by it has the same owner [6]
      • separation of ownership [3]
        • ownership of schemas and schema-scoped securables is transferable [3]
        • objects can be moved between schemas [3]
        • a single schema can contain objects owned by multiple database users  [3]
        • multiple database users can share a single default schema  [3]
        • permissions on schemas and schema-contained securables can be managed with greater precision than in earlier releases  [3]
        • each user has a default schema [3]
        • user’s default schema is used for name resolution during object creation or object reference [7]
          •  {warning} a user might not have permission to create objects in the dbo schema, even if that is the user’s default schema [7]
          • when a login in the sysadmin role creates an object with a single part name, the schema is always dbo [7]
          • a database user can be dropped without dropping objects in a corresponding schema  [3]
          • catalog views designed for earlier releases of SQL Server may return incorrect results
            • ⇐ includes sysobjects
            • more than 250 new catalog views were introduced to reflect the changes
          •  when creating a database object, if you specify a valid domain principal (user or group) as the object owner, the domain principal will be added to the database as a schema. The new schema will be owned by that domain principal [3]
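
    A minimal sketch of the separation of ownership described above (schema, object and principal names are hypothetical):

    -- transfer the ownership of a schema to another database principal
    ALTER AUTHORIZATION ON SCHEMA::Sales TO MaryJones;

    -- move an object to another schema; permissions granted directly on the object are dropped by the transfer
    ALTER SCHEMA Archive TRANSFER Sales.OrderHistory;
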
      • schema-qualified object name (aka two-part object name)
        • if schema is omitted, a schema resolution is performed (aka implicit resolution)
          • checks whether the object exists in the user's default schema
          • if it doesn't, checks whether it exists in the dbo schema [5]
            • extra costs are involved in resolving the object name (aka name resolution) [5]
              • uses a spinlock [8]
                • in rare occasions a spinlock could not be acquired immediately on such an operation
                  • this may occur on a system under significant load [8]
                  • the contention appears on the SOS_CACHESTORE spinlock type [8]
                  • {resolution} ensure that you always fully qualify your table names [8]
              • if multiple objects with the same name exist in different schemas, the wrong object might be retrieved [5]
          • improves readability
        • {recommendation} always use two-part object names in queries (aka schema-qualify objects) 
        • {poor practice} partition data and objects by using only schemas 
          •  instead of creating multiple databases [1]
        • {poor practice} complex schemas
          • developing a row-based security schema for an entire database using dozens or hundreds of views can create maintenance issues [6]
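
    A minimal sketch of implicit vs. schema-qualified name resolution (assumes a Sales schema exists besides dbo):

    -- two tables with the same name in different schemas
    CREATE TABLE dbo.Orders (OrderId int);
    CREATE TABLE Sales.Orders (OrderId int);

    -- unqualified name: resolved against the caller's default schema first, then dbo
    SELECT * FROM Orders;

    -- two-part name: no ambiguity and no extra name-resolution cost
    SELECT * FROM Sales.Orders;
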
    • {benefit} simplify database object management
      • groups of tables can be managed from a single point [4]
        • by creation of categories of tables [4]
      • helps navigation through database [4]
      • allow control permissions at schema level 
    • {benefit} provide separation of ownership 
      • allows to manage user permissions at the schema level, and then enhance them or override them at the object level as appropriate [10]
      • {recommendation} manage database object security by using ownership and permissions at the schema level [2]
      • {recommendation} have distinct owners for schemas or use a user without a login as a schema owner [2]
      • {recommendation} not all schemas should be owned by dbo [2]
      • {recommendation} minimize the number of owners for each schema [2]
    • {benefit} enhance security
      • by minimizing the risk of SQL injection
        • by assigning objects to schema it is possible to drop users without rewriting your applications as the name resolution is no longer depend upon the user or principals names 
      • used as an extra hierarchical layer for solution and security management [1]
        • gives architects and developers the ability to choose between the types of logical separation of objects they have created, as well as benefit from having a combination of multiple databases and multiple schemas within them [1]
    • {type} system schemas
      • can't be dropped
      • [default schema] dbo
        • included in each database 
        • if an application needs to create objects under the dbo schema, dbo privileges must be granted to the application [12]
          • increases the attack surface of the application [12]
          • increases the severity if the application is vulnerable to SQL Injection attacks [12]
        • can be set and changed by using DEFAULT_SCHEMA option of [3]
          • e.g. CREATE USER <user_name> WITH DEFAULT_SCHEMA = <schema_name>
          • e.g. ALTER USER <user_name> WITH DEFAULT_SCHEMA = <schema_name>
        • if DEFAULT_SCHEMA is left undefined, the database user will have dbo as its default schema [3]
          • [SQL Server 2005] Windows Groups are not allowed to have this property [11]
          • [SQL Server 2012] Windows Groups can also have a defined default schema [1]
            • streamlines the process of creating users
              • if no default schema is specified for a new user, the default schema of a group the user is a member of is used instead [9]
        • {warning} not to be confused with the dbo role [6]
      • INFORMATION_SCHEMA schema
        • an internal, system table-independent view of the SQL Server metadata
        • enable applications to work correctly although significant changes have been made to the underlying system tables
      • guest schema

      • sys schema 
        • provides a way to access all the system tables and views [7]
    • {type} user-defined schemas
      • {best practice} assign objects to user-defined schemas
        • leaving everything in the dbo schema is like putting everything in the root directory of your hard drive [8]
        • it saves the Query Processor a step from having to resolve the schema name out by itself [8]
          • avoid ambiguity
      • {best practice} assign each user a default schema
        •  ensures that if they create an object without specifying a schema, it will automatically go into their assigned container [8]
    • {type} role-based schemas
      • [SQL Server 2012] every fixed database role has a schema of the same name [7]
        • {exception} the public role
    • {action} create objects in schema
      • {prerequisite}
        • schema must exist
        • the user creating the object must have permission to create the object, either directly or through role membership [7]
        • the user creating the object must  either [7]
          • be the owner of the schema 
          • be a member of the role that owns the schema
          • have ALTER rights on the schema 
          • have the ALTER ANY SCHEMA permission in the database
      • {recommendation} group like objects together into the same schema [2]
    • {operation} create schema
      • {recommendation} use two-part names for database object creation and access [2]
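
    A minimal sketch of creating a schema and managing permissions at schema level (schema, table and principal names are hypothetical):

    -- create a schema owned by a dedicated principal
    CREATE SCHEMA Reporting AUTHORIZATION ReportingOwner;
    GO

    -- group related objects in the schema
    CREATE TABLE Reporting.SalesSummary (Year int, Amount decimal(19,4));

    -- grant permissions once, at the schema level
    GRANT SELECT ON SCHEMA::Reporting TO ReportingReader;
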
    • {operation} change schema (aka modify schema)
      • when schema changes are applied to an object and its data is manipulated in the same batch, SQL Server may not be aware of the schema changes yet and may fail the data manipulation statement with a resolution error [5]
        • the parsing does not check any object names or schemas because a schema may change by the time the statement executes [6]
      • triggers a database lock
      • invalidates existing query plans
        • a new plan will need to be recompiled for the queries as soon as they are run anew
      • not allowed on
        • [SQL Server 2014] [memory-optimized tables]
        • [table variables]
      • {best practice} explicitly list column names in statements in case a schema changes 
    • {operation} schema dropping
    • [windows groups]
      • an exception in the SQL Server security model [11]
      • a secondary identity with additional capabilities that are traditionally reserved only for primary identities [11]
        • require handling not seen in any other security system [11]
      •  can simplify management but due to their hybrid nature, they come with some restrictions [11]
      • {recommendation} for users mapped to Windows groups, try and limit each Windows user to one Windows group that has database access [2]

    References:
    [1] 40074A: Microsoft SQL Server 2014 for Oracle DBAs, Microsoft, 2015 
    [2] Bob Beauchemin et al (2012) SQL Server 2012 Security Best Practices - Operational and Administrative Tasks [whitepaper]
    [3] MSDN (2005) User-Schema Separation [link]
    [4] Solid Quality Learning (2007) Microsoft SQL Server 2005: Database Essentials Step by Step
    [5] Itzik Ben-Gan (2008) Microsoft® SQL Server® 2008 T-SQL Fundamentals
    [6] Adam Jorgensen et al (2012) Microsoft® SQL Server® 2012 Bible
    [7] Kalen Delaney et al (2013) Microsoft SQL Server 2012 Internals
    [8] Buck Woody (2009) SQL Server Best Practices: User-Defined Schemas [link obsolete]
    [9] Microsoft (2014) 10977B: Updating Your SQL Server Skills to SQL Server 2014 (Trainer Handbook)
    [10] Microsoft (2012) 20465B Designing Database Solutions for SQL Server 2012 (Trainer Handbook)
    [11] Laurentiu Cristofor (2008) SQL Server: Windows Groups, default schemas, and other properties [link]
    [12] Dan Sellers's WebLog (2006) Post Webcast’s Notes: Securing SQL Server 2005 for Developers  [link]
    [13] Microsoft Learn (2024) SQL Server 2022: System Information Schema Views (Transact-SQL)

    Acronyms:
    SQL - Structured Query Language

    💫🗒️ERP Systems: Microsoft Dynamics 365's Business Performance Analytics (BPA) [notes]

    Disclaimer: This is work in progress intended to consolidate information from the various sources and not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

    Last updated: 15-Mar-2025

    [Dynamics 365] Business Performance Analytics (BPA)

    • {def} centralized reporting hub within D365 F&O designed to streamline insights and help organizations make faster, data driven decisions [3]
      • solution designed to transform organization's data into actionable insights [1]
      • provides an intuitive out-of-box data model along with familiar tools like Microsoft Excel and Power BI for self-service analytics [4]
        • data extracted from D365 is classified in BPA in the form of value chains
          • ⇐ each value chain groups a set of related business processes [4]
    • {benefit} allows to simplify data insights by providing a unified view of business data across entities in near real time [4]
    • {benefit} allows to streamline financial and operations reporting to reduce the cycle times [4]
    • {benefit} allows users of all technical abilities to quickly access and analyze data to facilitate data driven decisions [4]
    • {benefit} provides auditors with direct access to financial data, making the audit process more efficient
    • {benefit} enables ease of use through familiar apps like Excel and Power BI, in addition to AI-driven insights and automation on a platform that is scalable and extendable [4]
    • {feature} extends into Microsoft Fabric
      • {benefit} provide a scalable, secure environment for handling large data sets and ensuring insights are always powered by the latest technology [3]
    • {feature} ETL process 
      • involves extracting data from finance and operations database, transforming and loading it into Dataverse [4]
        • each of the entities required to generate the dimensional model for the value chains is backed by underlying tables in the finance and operations database [4]
      • virtual entities created in Dataverse pull the data into the managed data lake [4]
      • the data is then transformed to generate the dimensional model, which is pushed into the embedded Power BI workspace in the form of analytical tables [4]
      • BPA consumes this data from the Power BI workspace to render the Power BI reports [4]
      • this data can also be extended to Fabric if there is a need to consolidate data from multiple sources [4]
    • {feature} reports 
      • designed to provide a detailed overview of an organization's financial health [8]
      • further reports will be added to expand the coverage for the value chains [8]
      • out-of-box reports can't be modified
        • ⇐ users cannot rename, delete or edit these type of reports [8]
        • there’s the option to duplicate the base report and edit the version thus created [8]
      • can be shared with other users who have access to BPA 
        • ⇐ they can receive an in-app notification [8]
        • can be shared over email with another user by entering user’s email address [8] 
        • one can configure whether the recipient can edit or view the report [8]
      •   {feature} allows to create a new Power BI or Excel report from scratch [8]
        • {option} start with a blank report or duplicate an existing report [8]
    • {feature} data refresh
      • automatic data refreshes run currently two times a day [4]
        • at 12:00 AM and 12:00 PM UTC
        • the volume of data is also constrained by the storage capacity of the A3 SKU for Power BI Embedded [1]
          • future release, may support additional data reporting capacity [1]
            • ⇐ so that larger data sets can be reported and analyzed [1]
        • the target is to have refreshes every hour or less [3]
      • data volume will be initially for about eight quarters of data [4]
      • extensibility will be supported with bring your own Fabric [4]
    • architecture
      • SaaS solution
        • {capability} immediate deployment 
          • businesses can start to analyze data and generate insights with minimal setup [1]
        • {capability} comprehensive reporting and dashboards
          • provides access to a wide range of preconfigured reports that cover multiple business functions [1]
        • {capability} near-real-time analytics 
          • future releases will offer more frequent data refreshes to enable near-real-time data analysis and reporting
        • {capability} predictive insights 
          • future releases will introduce predictive analytics capabilities that enable businesses to 
            • forecast trends
            • identify risks
            • seize opportunities [1]
        • {capability} user-friendly interface 
          • intuitive design ⇒ minimal training
            • fosters broader adoption 
            • enables a data-driven culture across the organization [1]
        • {capability} cost-effectiveness
          • available as part of D365 license
            • ⇒ provides advanced analytics without requiring significant investments in IT infrastructure [1]
      • DaaS solution
        • {capability} organizations can integrate its data models with their existing data warehousing infrastructure in Microsoft Fabric [1]
          • maximizes the value of existing data solutions [1]
          • positions businesses for future enhancements [1]
        • {capability} unified and scalable data models
          • customers can build custom models on top of a unified framework
            • ensures consistency and scalability across data sets [1]
        • {capability} future-proofing with automatic upgrades
          • data models integrate seamlessly with future D365 updates
            • reduces manual maintenance and ensures access to the latest features [1]
        • {capability} consistency and standardization
          • data models provide consistency and standardization across data sources
            • ensure high data quality and integrity [1]
        • {capability} advanced analytics and AI 
          • by customizing the data models, organizations can take advantage of advanced analytics and AI capabilities [1]
            • deeper insights without having to develop them from scratch [1]
        • {capability} enhanced data governance
          • unified data models support better data governance by providing standardized data definitions, relationships, and hierarchies [1]
            • ensure consistency and quality across the organization [1]
      • requires an integrated Power Platform environment [5]
        • must be integrated with the Microsoft Entra tenant [5]
      • uses shared Dataverse entitlements [1]
        • includes access to the data lake [1]
    • setup
      • dimensions
        • the selection of dimensions might affect the dimension groups that are created using these dimensions and the users who are assigned there [7]
          • e.g. legal entity, business unit
      • dimension groups
        • users can select specific values for the legal entity, or add a range of values [7]
          • selecting an invalid combination of dimension values, the dimension group will filter out all the records on the report [7]
        • {warning} assigning too many dimension groups to a user, slows the load for that user [7]
      • roles
        • determine which reports the user can access [7]
    • security
      • secure data through role-based access control on top of the value chains [7]
      • the first user who signs into the app is assigned the BPA admin role [7]
        • allows a user to access the administrator section of the BPA [7]
          • where the security can be set up [7]
        • has automatically assigned 
          • Microsoft report viewer role 
          • the All Access Dimension group [7]
            • allow the admin to see the data  in all the reports across all the dimensions [7]
      • {feature} dimension-based role-level security
        • ensures that users only see the data relevant to them based on their role
          •  confidently share reports without duplicating them
            • ⇐ data is automatically filtered by organization's security policies [3]
        • simple but powerful way to maintain control while providing access for teams that love working in Excel [3]
    • accessibility
      • can be accessed through either 
        • Power Platform
          • admins can access the BPA app through the Power Apps maker portal [6]
        • Dynamics 365
          • through the BPA preview shortcut in the homepage or the default dashboard [6]
          • for end users, the BPA preview shortcut is provided when they have certain duties associated to their role(s) [6]
    • licensing
      • included in D365 F&O license [4]
    • requirements
      • requires a tier two environment and Dynamics 365 finance version 1.0.38 or later [5]
    • {project} timeline
      • [2025 wave 1] backup and restore custom reports and analytics
        • {benefit} support better lifecycle management and empower customers to develop on sandbox instances before publishing to production [3]
      • 2025: available in all regions where F&O is available [3]
      • Oct-2024: GA

    References:
    [1] Microsoft Learn (2024) Dynamics 365 Finance: What is Business performance analytics? [link]
    [2] Microsoft Learn (2025) Business performance analytics (BPA) with Dynamics 365 Finance [link]
    [3] Dynamics 365 Finance - Business Performance Analytics 2025 Release Wave 1 Release Highlights [link]
    [4] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 1 [link]
    [5] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 2 [link]
    [6] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 3 [link]
    [7] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 4 [link]   
    [8] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 5 [link]
    [9] Microsoft Learn (2024) Dynamics 365: Business performance analytics introduction [link]

    Acronyms:
    AI - Artificial Intelligence
    BPA - Business Performance Analytics
    D365 F&O - Dynamics 365 for Finance and Operations
    DaaS - Data-as-a-Service
    ETL - Extract, Transform, Load
    GA - General Availability
    MF - Microsoft Fabric
    PP - Public Preview
    SaaS - Software-as-a-Service
    SKU - Stock Keeping Unit
    UTC - Coordinated Universal Time

    22 January 2025

    🏭🗒️Microsoft Fabric: Folders [Notes]

    Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

    Last updated: 22-Jan-2025

    [Microsoft Fabric] Folders

    • {def} organizational units inside a workspace that enable users to efficiently organize and manage artifacts in the workspace [1]
    • identifiable by its name
      • {constraint} must be unique in a folder or at the root level of the workspace
      • {constraint} can’t include certain special characters [1]
        • C0 and C1 control codes [1]
        • leading or trailing spaces [1]
        • characters: ~"#.&*:<>?/{|} [1]
      • {constraint} can’t have system-reserved names
        • e.g. $recycle.bin, recycled, recycler.
      • {constraint} its length can't exceed 255 characters
    • {operation} create folder
      • can be created in
        • an existing folder (aka nested subfolder) [1]
          • {restriction} a maximum of 10 levels of nested subfolders can be created [1]
          • up to 10 folders can be created in the root folder [1]
          • {benefit} provide a hierarchical structure for organizing and managing items [1]
        • the root
    • {operation} move folder
    • {operation} rename folder
      • the same rules apply as for folder creation [1]
    • {operation} delete folder
      • {restriction} currently only empty folders can be deleted [1]
        • {recommendation} make sure the folder is empty [1]
    • {operation} create item in folder
      • {restriction} certain items can’t be created in a folder
        • dataflows gen2
        • streaming semantic models
        • streaming dataflows
      • ⇐ items created from the home page or the Create hub, are created at the root level of the workspace [1]
    • {operation} move file(s) between folders [1]
    • {operation} publish to folder [1]
      •   Power BI reports can be published to specific folders
        • {restriction} reports' names must be unique throughout an entire workspace, regardless of their location [1]
          • when publishing a report to a workspace that has another report with the same name in a different folder, the report will publish to the location of the already existing report [1]
    • {limitation} may not be supported by certain features
      •   e.g. Git
    • {recommendation} use folders to organize workspaces [1]
    • {permissions}
      • inherit the permissions of the workspace where they're located [1] [2]
      • workspace admins, members, and contributors can create, modify, and delete folders in the workspace [1]
      • viewers can only view folder hierarchy and navigate in the workspace [1]
    • [deployment pipelines] when deploying items in folders to a different stage, the folder hierarchy is automatically applied [2]

    Previous Post  <<||>>  Next Post

    References:
    [1] Microsoft Fabric (2024) Create folders in workspaces [link]
    [2] Microsoft Fabric (2024) The deployment pipelines process [link]
    [3] Microsoft Fabric Updates Blog (2025) Define security on folders within a shortcut using OneLake data access roles [link]
    [4] Microsoft Fabric Updates Blog (2025) Announcing the General Availability of Folder in Workspace [link]
    [5] Microsoft Fabric Updates Blog (2025) Announcing Folder in Workspace in Public Preview [link]
    [6] Microsoft Fabric Updates Blog (2025) Getting the size of OneLake data items or folders [link]

    Resources:
    [R1] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]

    20 January 2025

    🏭🗒️Microsoft Fabric: [Azure] Service Principals (SPN) [Notes]

    Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

    Last updated: 20-Jan-2025

    [Azure] Service Principal (SPN)  

    • {def} a non-human, application-based security identity used by applications or automation tools to access specific Azure resources [1]
      • can be assigned precise permissions, making them perfect for automated processes or background services
        • allows to minimize the risks of human error and identity-based vulnerabilities
        • supported in datasets, Gen1/Gen2 dataflows, datamarts [2]
        • authentication type 
          • supported only by [2]
            • Azure Data Lake Storage
            • Azure Data Lake Storage Gen2
            • Azure Blob Storage
            • Azure Synapse Analytics
            • Azure SQL Database
            • Dataverse
            • SharePoint online
          • doesn’t support
            • SQL data source with Direct Query in datasets [2]
    • when registering a new application in Microsoft Entra ID, a SPN is automatically created for the app registration [4]
      • the access to resources is restricted by the roles assigned to the SPN
        • ⇒ gives control over which resources can be accessed and at which level [4]
      • {recommendation} use SPN with automated tools [4]
        • rather than allowing them to sign in with a user identity  [4]
      • {prerequisite} an active Microsoft Entra user account with sufficient permissions to 
        • register an application with the tenant [4]
        • assign to the application a role in the Azure subscription [4]
        •  requires Application.ReadWrite.All permission [4]
    • extended to support Fabric Data Warehouses [1]
      • {benefit} automation-friendly API Access
        • allows to create, update, read, and delete Warehouse items via Fabric REST APIs using service principals [1]
        • enables to automate repetitive tasks without relying on user credentials [1]
          • e.g. provisioning or managing warehouses
          • increases security by limiting human error
        • the warehouses thus created will be displayed in the Workspace list view in the Fabric UI, with the SPN as Owner [1]
        • applicable to users with administrator, member, or contributor workspace role [3]
        • minimizes risk
          • the warehouses created with delegated account or fixed identity (owner’s identity) will stop working when the owner leaves the organization [1]
            • Fabric requires the user to login every 30 days to ensure a valid token is provided for security reasons [1]
      • {benefit} seamless integration with Client Tools: 
        • tools like SSMS can connect to the Fabric DWH using SPN [1]
        • SPN provides secure access for developers to 
          • run COPY INTO
            • with and without firewall enabled storage [1]
          • run any T-SQL query programmatically on a schedule with ADF pipelines [1]
      • {benefit} granular access control
        • Warehouses can be shared with an SPN through the Fabric portal [1]
          • once shared, administrators can use T-SQL commands to assign specific permissions to SPN [1]
            • allows to control precisely which data and operations an SPN has access to  [1]
              • GRANT SELECT ON <table name> TO <Service principal name>  
        • warehouses' ownership can be changed from an SPN to user, and vice-versa [3]
      • {benefit} improved DevOps and CI/CD Integration
        • SPN can be used to automate the deployment and management of DWH resources [1]
          •  ensures faster, more reliable deployment processes while maintaining strong security postures [1]
      • {limitation} default semantic models are not supported for SPN created warehouses [3]
        • ⇒ features such as listing tables in dataset view, creating report from the default dataset don’t work [3]
      • {limitation} SPN for SQL analytics endpoints is not currently supported
      • {limitation} SPNs are currently not supported for COPY INTO error files [3]
        • ⇐ Entra ID credentials are not supported as well [3]
      • {limitation} SPNs are not supported for GIT APIs. SPN support exists only for Deployment pipeline APIs [3]
      • monitoring tools
        • [DMV] sys.dm_exec_sessions.login_name column [3] 
        • [Query Insights] queryinsights.exec_requests_history.login_name [3]
        • Query activity
          • submitter column in Fabric query activity [3]
        • Capacity metrics app: 
          • compute usage for warehouse operations performed by SPN appears as the Client ID under the User column in Background operations drill through table [3]
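
    A minimal sketch of checking which sessions and requests were issued by a service principal; column selection is limited to the fields named in the notes above:

    -- active sessions: the SPN shows up under its own login_name
    SELECT session_id, login_name, status, login_time
    FROM sys.dm_exec_sessions
    ORDER BY login_time DESC;

    -- historical requests captured by Query Insights expose the same identity via login_name
    SELECT *
    FROM queryinsights.exec_requests_history;
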

    References:
    [1] Microsoft Fabric Updates Blog (2024) Service principal support for Fabric Data Warehouse [link]
    [2] Microsoft Fabric Learn (2024) Service principal support in Data Factory [link]
    [3] Microsoft Fabric Learn (2024) Service principal in Fabric Data Warehouse [link]
    [4] Microsoft Fabric Learn (2024) Register a Microsoft Entra app and create a service principal [link]
    [5] Microsoft Fabric Updates Blog (2024) Announcing Service Principal support for Fabric APIs [link]

    Resources:
    [R1] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]
     
    Acronyms:
    ADF - Azure Data Factory
    API - Application Programming Interface
    CI/CD - Continuous Integration/Continuous Deployment
    DMV - Dynamic Management View
    DWH - Data Warehouse
    SPN - service principal
    SSMS - SQL Server Management Studio

    17 January 2025

    💎🏭SQL Reloaded: Microsoft Fabric's SQL Databases (Part VIII: Permissions) [new feature]

    Data-based solutions usually target a set of users who (ideally) have restricted permissions to the functionality. Therefore, as part of the process, several personas are defined that target different use cases, and the permissions must be restricted accordingly for each of them.

    In the simplest scenario the user must have access to the underlying objects for querying the data. Supposing that an Entra User was already created, the respective user must also be given access in the Fabric database (see [1], [2]). From the database's main menu, follow the path below to assign read permissions:
    Security >> Manage SQL Security >> (select role: db_datareader)

    Manage SQL Security

    Manage access >> Add >> (search for User)

    Manage access

    (select user) >> Share database >> (select additional permissions) >> Save

    Manage additional permissions

    The easiest way to test whether the permissions work before building the functionality is to log in via SQL Server Management Studio (SSMS) and check the access using Microsoft Entra MFA. Ideally, one should have a User's credentials that can be used only for testing purposes. After the above setup was done, the new User was able to access the data.

    A second User can be created for testing with the maximum of permissions allowed on the SQL database side, which is useful for troubleshooting. Alternatively, one can use only one User for testing and assign or remove the permissions as needed by the test scenario. 

    It's a good idea to try to understand what's happening in the background. For example, the expectation was that a SQL user would also be created for the Entra User created above, which doesn't seem to be the case, at least with the current functionality.

    Before diving deeper, it's useful to retrieve the User's details:

    -- retrieve current user
    SELECT SUser_Name() sys_user_name
    , User_Id() user_id 
    , USER_NAME() user_name
    , current_user [current_user]
    , user [user]; 
    
    Output:
    sys_user_name user_id user_name current_user user
    JamesClavell@[domain].onmicrosoft.com 0 JamesClavell@[domain].onmicrosoft.com JamesClavell@[domain].onmicrosoft.com JamesClavell@[domain].onmicrosoft.com

    Retrieving the current User is especially useful when testing functionality in parallel with different Users. Strangely, the User's ID is 0 when only read permissions were assigned. However, a valid User identifier is added, for example, when the db_datawriter role is also assigned to the User. Removing the db_datawriter role afterwards keeps, as expected, the User's ID. For troubleshooting purposes, at least per the current functionality, it might be a good idea to create the Users with a valid User ID (e.g. by temporarily assigning the db_datawriter role to the User, as sketched below).
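
    A minimal sketch of the temporary role assignment (the user name below is a placeholder):

    -- temporarily add the User to db_datawriter so that a valid user_id gets assigned
    ALTER ROLE db_datawriter ADD MEMBER [JamesClavell@contoso.onmicrosoft.com];

    -- check the assigned identifier
    SELECT USER_ID('JamesClavell@contoso.onmicrosoft.com') AS user_id;

    -- remove the role again; the user_id is kept
    ALTER ROLE db_datawriter DROP MEMBER [JamesClavell@contoso.onmicrosoft.com];
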

    The next step is to look at the Users with access to the database:

    -- database access 
    SELECT USR.uid
    , USR.name
    --, USR.sid 
    , USR.hasdbaccess 
    , USR.islogin
    , USR.issqluser
    --, USR.createdate 
    --, USR.updatedate 
    FROM sys.sysusers USR
    WHERE USR.hasdbaccess = 1
      AND USR.islogin = 1
    ORDER BY uid
    
    Output:
    uid name hasdbaccess islogin issqluser
    1 dbo 1 1 1
    6 CharlesDickens@[...].onmicrosoft.com 1 1 0
    7 TestUser 1 1 1
    9 JamesClavell@[...].onmicrosoft.com 1 1 0

    For testing purposes, besides the standard dbo user and the two Entra-based users, a SQL user without login was also created and granted access to the SalesLT schema (see the initial post):

    -- create the user
    CREATE USER TestUser WITHOUT LOGIN;
    
    -- assign access to SalesLT schema 
    GRANT SELECT ON SCHEMA::SalesLT TO TestUser;
      
    -- test impersonation (run together)
    EXECUTE AS USER = 'TestUser';
    
    SELECT * FROM SalesLT.Customer;
    
    REVERT; 

    Notes:
    1) Strangely, even if access was given explicitly only to the SalesLT schema, the TestUser User also has access to sys.sysusers and other DMVs. The same applies to access over SSMS.
    2) For the above created User there are no records in the sys.user_token and sys.login_token DMVs, in contrast with the user(s) created for administering the SQL database. 
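
    One way to check the second note is to query the token views while using the same impersonation pattern as above:

    -- token views for the impersonated SQL user (run together)
    EXECUTE AS USER = 'TestUser';

    SELECT * FROM sys.user_token;   -- no records for TestUser, per the note above
    SELECT * FROM sys.login_token;  -- no records for TestUser, per the note above

    REVERT;
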

    Let's look at the permissions granted explicitly:

    -- permissions granted explicitly
    SELECT DPR.principal_id
    , DPR.name
    , DPR.type_desc
    , DPR.authentication_type_desc
    , DPE.state_desc
    , DPE.permission_name
    FROM sys.database_principals DPR
         JOIN sys.database_permissions DPE
    	   ON DPR.principal_id = DPE.grantee_principal_id
    WHERE DPR.principal_id != 0 -- removing the public user
    ORDER BY DPR.principal_id
    , DPE.permission_name;
    
    Result:
    principal_id name type_desc authentication_type_desc state_desc permission_name
    1 dbo SQL_USER INSTANCE GRANT CONNECT
    6 CharlesDickens@[...].onmicrosoft.com EXTERNAL_USER EXTERNAL GRANT AUTHENTICATE
    6 CharlesDickens@[...].onmicrosoft.com EXTERNAL_USER EXTERNAL GRANT CONNECT
    7 TestUser SQL_USER NONE GRANT CONNECT
    7 TestUser SQL_USER NONE GRANT SELECT
    9 JamesClavell@[...].onmicrosoft.com EXTERNAL_USER EXTERNAL GRANT CONNECT

    During troubleshooting it might be useful to check current user's permissions at the various levels via sys.fn_my_permissions:

    -- retrieve database-scoped permissions for current user
    SELECT *
    FROM sys.fn_my_permissions(NULL, 'Database');
    
    -- retrieve schema-scoped permissions for current user
    SELECT *
    FROM sys.fn_my_permissions('SalesLT', 'Schema');
    
    -- retrieve object-scoped permissions for current user
    SELECT *
    FROM sys.fn_my_permissions('SalesLT.Customer', 'Object')
    WHERE permission_name = 'SELECT';
    

    Notes:
    1) See also [1] and [4] in what concerns the limitations that apply to managing permissions in SQL databases.

    Happy coding!

    Previous Post <<||>> Next Post

    References:
    [1] Microsoft Learn (2024) Microsoft Fabric: Share your SQL database and manage permissions [link]
    [2] Microsoft Learn (2024) Microsoft Fabric: Share data and manage access to your SQL database in Microsoft Fabric  [link]
    [3] Microsoft Learn (2024) Authorization in SQL database in Microsoft Fabric [link]
    [4] Microsoft Learn (2024) Authentication in SQL database in Microsoft Fabric [link]

    [5] Microsoft Fabric Learn (2025) Manage access for SQL databases in Microsoft Fabric with workspace roles and item permissions [link]
