
23 March 2025

💫🗒️ERP Systems: Microsoft Dynamics 365's Financial Tags [Notes]

Disclaimer: This is a work in progress intended to consolidate information from various sources, not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

Last updated: 23-Mar-2025

[Dynamics 365] Financial Tags

  • {def} user-defined metadata elements used to track additional information on accounting entries for analytics or process purposes
    • provide an additional layer of metadata
    • {objective} eliminate the need to use document numbers, descriptions, or financial dimensions to store such information [1]
    • stored on the accounting entries that are created for the transactions [1]
    • {benefit} improved accuracy 
      • ensure each transaction is linked with the correct accounting and auditing elements, enhancing the accuracy in financial reporting and compliance [8]
    • {benefit} streamlined processes 
      • by automating the categorization of financial transactions, financial tags contribute to a more efficient invoicing process [8]
    • {benefit} better financial tracking
      • allow for granular tracking of expenses and revenues, enabling more detailed financial analysis [8]
    • shown as separate columns on voucher transactions and similar GL inquiry forms 
    • legal entity specific
    • can be shared by using the Shared data feature [3]
    • designed to support any amount of reuse
    • do not default from master data
      • {feature|planned} defaulting will be enabled through user-defined rules
    • similar to financial dimensions
      • an alternative to creating financial dimensions
        • ⇐ financial dimensions are structured (account structures, account rules, validation) and designed for medium to high reuse
      • the two are most likely mutually exclusive
      • every transaction that supports dimensions will eventually support financial tags 
    • unstructured 
      • ⇐ no structure, no rules, no validation
    • require a delimiter between the tag values
      • configured via General ledger parameters >> Financial tags (see the parsing sketch after these notes)
    • can be deactivated but not deleted
      • ⇐ helps ensure that the tag values remain available for reporting on posted general ledger entries
      • can be activated and deactivated at any time
    • the label of each financial tag can be changed at any time, even after transactions are posted
      • if transactions have been posted for a specific financial tag, the tag values don't change
    • tag values
      • are associated with an accounting entry
      • can be reused 
      • have header to line defaulting
      • are stored as simple text 
      • do not reference other data 
      • are not validated at any time, including during entry and posting
      • can be entered or edited at any time prior to posting 
      • can be changed at any time after posting 
        • by enabling "Allow edits to internal data on general ledger vouchers" feature
    • up to 20 financial tags can be defined
      • e.g. Customers, Vendors, Projects, PO numbers, Payment references
      • each is 100 characters [1]
  • {type} text 
    • free text with no lookup 
  • {type} custom list
    • free text with lookup 
  • {type} list
    • predefined list of many common types of data with lookup 
      • list values are also not validated
  • supported by
    • general journals
    • customer and vendor payment journals, including entities 
  • can be disabled at any time [1]
    • any values that were entered for financial tags on transactions will be maintained in the database [1]
      • values will no longer be visible on any transactions or in inquiries [1]
  • journal and transaction support for tags
    • [10.0.32] introduced
    • [10.0.37] [1]
      • general journal, including entities 
      • global general journal
      • allocation journal
      • fixed asset journal
      • all asset leasing journals
      • periodic journal
      • reporting currency adjustment journal
      • customer payment journal, including entities 
      • vendor payment journal, including entities 
      • invoice journal (vendor)
      • global invoice journal (vendor)
      • invoice register
      • SO documents 
        • Sales order, packing slip and customer invoice
        • {feature} "Enable financial tags for sales order invoicing"
      • voucher transactions and Transactions for [account] forms 
      • general journal account entry reporting entity 
      • ledger settlement (manual settlement)
    • [10.0.41|PP] PO documents
      • {feature} "Enable financial tags for purchase order invoicing"
  • {feature} [10.0.42] financial tag rules 
    • allow entering default values or automatically populating financial tag values [7]
    • {benefit} ensure consistency and efficiency in transaction tagging [7]
      • ⇐ essential for accurate financial tracking and reporting [7]
    • journals support [7]
      • general journal
      • global general journal
      • allocation journal
      • reporting currency adjustment journal
      • invoice journal (vendor)
    • {operation} Create a financial tag rule
      • via General ledger >> Chart of accounts >> Financial tags >> Financial tags >> New
    • {operation} Copy a financial tag rule within legal entity
      • copies a rule that is defined for one transaction entry point to another entry point in the same legal entity [7]
    • {operation} Copy a financial tag rule to another legal entity
      • copies rules to any legal entity where financial tags are defined and active; one or more rules can be selected for copying [7]
  • {feature} rule-based defaulting engine for financial tags 
    • e.g. default the vendor name to financial tag XX 
  • {feature} financial tag defaulting rules
  • {feature} evaluate storing financial tags directly on subledger data
    • e.g. store financial tag values in the bank subledger to use with advanced bank reconciliation matching rules
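
Given that the tag values are stored as delimiter-separated text rather than as references to master data, reporting on them typically involves splitting the stored string. Below is a minimal T-SQL sketch of such a split; the tag string, the delimiter and the tag order are made-up examples rather than the actual D365 storage schema, and STRING_SPLIT with the ordinal argument requires SQL Server 2022 or Azure SQL:

-- split a delimiter-separated financial tag string into positional columns
DECLARE @TagValues nvarchar(400) = N'Contoso|V-00042|PRJ-007'; -- Customer|Vendor|Project
DECLARE @Delimiter nchar(1) = N'|'; -- as configured in General ledger parameters

SELECT Max(CASE ordinal WHEN 1 THEN value END) CustomerTag
, Max(CASE ordinal WHEN 2 THEN value END) VendorTag
, Max(CASE ordinal WHEN 3 THEN value END) ProjectTag
FROM STRING_SPLIT(@TagValues, @Delimiter, 1);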

References:
[1] Microsoft Learn (2025) Dynamics 365 Finance: Financial tags [link]
[2] Microsoft Learn (2025) Dynamics 365 Finance: Differences between financial tags and financial dimensions [link]
[3] Microsoft Learn (2022) Dynamics 365 Finance: Financial dimensions [link]
[4] Dynamics on Demand (2025) Financial Tags in Microsoft Dynamics 365 Finance | 10.0.32 [link]
[5] Ramit Paul (2025) Financial Tags in Microsoft Dynamics 365 Finance and Operations [link]
[6] Microsoft Learn (2025) Dynamics 365 Finance: Financial tag rule reference (preview) [link]
[7] Microsoft Learn (2025) Dynamics 365 Finance: Financial tag rules (preview) [link]
[8] Dynamics Global Edge IT Solutions (2024) Financial Tags For Purchase Order Invoicing In MS Dynamics365 F&O [link]

Resources:
[R1] Dynamics365lab (2024) Ep. 120:4 Exploring Financial Tags in Dynamics 365 F&O [link]
[R2] Nextone Consulting (2024) New Feature: Financial Tag Rules in Dynamics 365 SCM 10.0.42 [link]
[R3] Dynamics on Demand (2024) Financial Tags in Microsoft Dynamics 365 Finance | 10.0.32 [link]
[R4] Axcademy (2023) Is this the end to Financial dimensions in D365FO as we know them? [link]
[R5] Hitachi Solutions (2024) New Feature in Dynamics 365 Finance - Financial Tags [link]

Acronyms:
D365 F&O - Dynamics 365 for Finance and Operations
GL - General Ledger
GA - General Availability
LE - Legal Entity
PO - Purchase Order
PP - Public Preview
SO - Sales Order

15 March 2025

💫🗒️ERP Systems: Microsoft Dynamics 365's Business Performance Analytics (BPA) [notes]

Disclaimer: This is a work in progress intended to consolidate information from various sources, not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

Last updated: 15-Mar-2025

[Dynamics 365] Business Performance Analytics (BPA)

  • {def} centralized reporting hub within D365 F&O designed to streamline insights and help organizations make faster, data-driven decisions [3]
    • solution designed to transform organization's data into actionable insights [1]
    • provides an intuitive out-of-box data model along with familiar tools like Microsoft Excel and Power BI for self-service analytics [4]
      • data extracted from D365 is classified in BPA in the form of value chains
        • ⇐ a value chain groups together related business processes [4]
  • {benefit} simplifies data insights by providing a unified view of business data across entities in near real time [4]
  • {benefit} streamlines financial and operations reporting to reduce cycle times [4]
  • {benefit} allows users of all technical abilities to quickly access and analyze data to facilitate data-driven decisions [4]
  • {benefit} provides auditors with direct access to financial data, making the audit process more efficient
  • {benefit} enables ease of use through familiar apps like Excel and Power BI, in addition to AI-driven insights and automation on a platform that is scalable and extendable [4]
  • {feature} extends into Microsoft Fabric
    • {benefit} provide a scalable, secure environment for handling large data sets and ensuring insights are always powered by the latest technology [3]
  • {feature} ETL process 
    • involves extracting data from finance and operations database, transforming and loading it into Dataverse [4]
      • each of the entities required to generate the dimensional model for the value chains mentioned earlier is backed by underlying tables in the finance and operations database [4]
    • once installed in Dataverse, the virtual entities that are created pull the data into the managed data lake [4]
    • the data is then transformed to generate the dimensional model, which is pushed into the embedded Power BI workspace in the form of analytical tables [4]
    • BPA consumes this data from the Power BI workspace to render the Power BI reports [4]
    • this data can also be extended to Fabric if there is a need to consolidate data from multiple sources [4]
  • {feature} reports 
    • designed to provide a detailed overview of an organization's financial health [8]
    • further reports will be added to expand the coverage for the value chains [8]
    • out-of-box reports can't be modified
      • ⇐ users cannot rename, delete, or edit these types of reports [8]
      • there’s the option to duplicate the base report and edit the version thus created [8]
    • can be shared with other users who have access to BPA 
      • ⇐ they can receive an in-app notification [8]
      • can be shared over email with another user by entering the user's email address [8]
      • one can configure whether the recipient can edit or view the report [8]
    • {feature} allows creating a new Power BI or Excel report from scratch [8]
      • {option} start with a blank report or duplicate an existing report [8]
  • {feature} data refresh
    • automatic data refreshes currently run twice a day [4]
      • at 12:00 AM and 12:00 PM UTC
      • the volume of data is also constrained by the storage capacity of the A3 SKU for Power BI Embedded [1]
        • future release, may support additional data reporting capacity [1]
          • ⇐ so that larger data sets can be reported and analyzed [1]
      • the target is to have refreshes every hour or less [3]
    • initially, the data volume covers about eight quarters of data [4]
    • extensibility will be supported with bring your own Fabric [4]
  • architecture
    • SaaS solution
      • {capability} immediate deployment 
        • businesses can start to analyze data and generate insights with minimal setup [1]
      • {capability} comprehensive reporting and dashboards
        • provides access to a wide range of preconfigured reports that cover multiple business functions [1]
      • {capability} near-real-time analytics 
        • future releases will offer more frequent data refreshes to enable near-real-time data analysis and reporting
      • {capability} predictive insights 
        • future releases will introduce predictive analytics capabilities that enable businesses to 
          • forecast trends
          • identify risks
          • seize opportunities [1]
      • {capability} user-friendly interface 
        • intuitive design ⇒ minimal training
          • fosters broader adoption 
          • enables a data-driven culture across the organization [1]
      • {capability} cost-effectiveness
        • available as part of D365 license
          • ⇒ provides advanced analytics without requiring significant investments in IT infrastructure [1]
    • DaaS solution
      • {capability} organizations can integrate its data models with their existing data warehousing infrastructure in Microsoft Fabric [1]
        • maximizes the value of existing data solutions [1]
        • positions businesses for future enhancements [1]
      • {capability} unified and scalable data models
        • customers can build custom models on top of a unified framework
          • ensures consistency and scalability across data sets [1]
      • {capability} future-proofing with automatic upgrades
        • data models integrate seamlessly with future D365 updates
          • reduces manual maintenance and ensures access to the latest features [1]
      • {capability} consistency and standardization
        • data models provide consistency and standardization across data sources
          • ensure high data quality and integrity [1]
      • {capability} advanced analytics and AI 
        • by customizing the data models, organizations can take advantage of advanced analytics and AI capabilities [1]
          • deeper insights without having to develop them from scratch [1]
      • {capability} enhanced data governance
        • unified data models support better data governance by providing standardized data definitions, relationships, and hierarchies [1]
          • ensure consistency and quality across the organization [1]
    • requires an integrated Power Platform environment [5]
      • must be integrated with the Microsoft Entra tenant [5]
    • uses shared Dataverse entitlements [1]
      • includes access to the data lake [1]
  • setup
    • dimensions
      • the selection of dimensions might affect the dimension groups that are created from these dimensions and the users who are assigned to them [7]
        • e.g. legal entity, business unit
    • dimension groups
      • users can select specific values for the legal entity, or add a range of values [7]
        • if an invalid combination of dimension values is selected, the dimension group filters out all the records on the report [7]
      • {warning} assigning too many dimension groups to a user slows the load for that user [7]
    • roles
      • determine which reports the user can access [7]
  • security
    • secure data through role-based access control on top of the value chains [7]
    • the first user who signs into the app is assigned the BPA admin role [7]
      • allows a user to access the administrator section of the BPA [7]
        • where the security can be set up [7]
      • is automatically assigned 
        • the Microsoft report viewer role 
        • the All Access Dimension group [7]
          • ⇐ allows the admin to see the data in all the reports across all the dimensions [7]
    • {feature} dimension-based role-level security
      • ensures that users only see the data relevant to them based on their role
        • reports can be shared confidently without duplicating them
          • ⇐ data is automatically filtered by organization's security policies [3]
      • simple but powerful way to maintain control while providing access for teams that love working in Excel [3]
  • accessibility
    • can be accessed through either 
      • Power Platform
        • admins can access the BPA app through the Power Apps maker portal [6]
      • Dynamics 365
        • through the BPA preview shortcut on the homepage or the default dashboard [6]
        • for end users, the BPA preview shortcut is provided when they have certain duties associated with their role(s) [6]
  • licensing
    • included in D365 F&O license [4]
  • requirements
    • requires a tier-two environment and Dynamics 365 Finance version 10.0.38 or later [5]
  • {project} timeline
    • [2025 wave 1] backup and restore custom reports and analytics
      • {benefit} support better lifecycle management and empower customers to develop on sandbox instances before publishing to production [3]
    • 2025: available in all regions where F&O is available [3]
    • Oct-2024: GA

References:
[1] Microsoft Learn (2024) Dynamics 365 Finance: What is Business performance analytics? [link]
[2] Microsoft Learn (2025) Business performance analytics (BPA) with Dynamics 365 Finance [link]
[3] Dynamics 365 Finance - Business Performance Analytics 2025 Release Wave 1 Release Highlights [link]
[4] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 1 [link]
[5] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 2 [link]
[6] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 3 [link]
[7] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 4 [link]   
[8] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 5 [link]
[9] Microsoft Learn (2024) Dynamics 365: Business performance analytics introduction [link]

Acronyms:
AI - Artificial Intelligence
BPA - Business Performance Analytics
D365 F&O - Dynamics 365 for Finance and Operations
DaaS - Data-as-a-Service
ETL - Extract, Transform, Load
GA - General Availability
MF - Microsoft Fabric
PP - Public Preview
SaaS - Software-as-a-Service
SKU - Stock Keeping Unit
UTC - Coordinated Universal Time

01 March 2025

💎🤖💫SQL Reloaded: Copilot Stories (Part VI: Queries based on Dynamics 365 for Finance & Operations)

As part of ad-hoc queries, SSRS/Paginated Reports and Data Warehouse solutions, during the past 15 years I wrote hundreds of queries based on Dynamics AX, respectively on Dynamics 365 (D365) for Finance, Supply Chain, Production, Human Resources and a few other modules. So, given that the documentation for the D365 schema is available online, I've been wondering how easy it would be to write a basic query using M365 Copilot. A relatively simple example would be the join between the Products (InventTable) and Prices (InventTableModule) tables, quite a simple thing to do (the joins were discussed here).

Disclaimer: The below code was generated by Copilot via prompts used for demonstration purposes!

The first attempt proved to be far from expectations:

Prompt:
Write a query that joins InventTable and InventTableModule to retrieve the Item Number, Description and the Item Prices from Dynamics 365 for Finance. Include tables' schema and rewrite the query by placing the comma first.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , ITM.Price AS [Item Price]
FROM 
    dbo.InventTable IT
JOIN 
    dbo.InventTableModule ITM 
    ON IT.ItemId = ITM.ItemId 
    AND IT.DataAreaId = ITM.DataAreaId
WHERE 
    ITM.ModuleType = 0 -- Assuming 0 represents the sales module, adjust as necessary

A first version of the query didn't include the join on DataAreaId. Another problem is that, compared with the previous versions, in D365 the ItemName field is no longer available in InventTable, while ItemName was incorrectly used as the Description (see the sketch below for how the name is typically retrieved). The differences between the various versions of D365 are minimal, though the differences between D365 and AX 2009 are considerable, with important changes due to the redesign.
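
For reference, in D365 F&O the product name is typically retrieved via the product translations. Here's a minimal sketch of the join as commonly used in the community; the language code and filter values are illustrative, and the table and field names should be verified against the version in use:

-- Item name via product translations
SELECT ITM.DataAreaId
, ITM.ItemId
, EPT.Name ProductName
FROM dbo.InventTable ITM
     LEFT JOIN dbo.EcoResProductTranslation EPT
       ON ITM.Product = EPT.Product
      AND EPT.LanguageId = 'en-us'
WHERE ITM.DataAreaId = 'XYZ'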

Prompt:
Rewrite the query to provide the other existing prices via a grouping.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , STRING_AGG(CAST(ITM.Price AS NVARCHAR), ', ') AS [Item Prices]
FROM 
    dbo.InventTable IT
JOIN 
    dbo.InventTableModule ITM 
    ON IT.ItemId = ITM.ItemId 
    AND IT.DataAreaId = ITM.DataAreaId
WHERE 
    ITM.ModuleType = 0 -- Assuming 0 represents the sales module, adjust as necessary
GROUP BY 
    IT.ItemId, IT.ItemName

The generated query shows all the prices as a single string, a solution that might work if the delimiter is changed, though it's not optimal at all.

Prompt:
Rewrite the query to provide the other existing individual prices via a grouping in separate fields.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , MAX(CASE WHEN ITM.ModuleType = 0 THEN ITM.Price END) AS [Sales Price]
    , MAX(CASE WHEN ITM.ModuleType = 1 THEN ITM.Price END) AS [Purchase Price]
    , MAX(CASE WHEN ITM.ModuleType = 2 THEN ITM.Price END) AS [Cost Price]
FROM 
    dbo.InventTable IT
JOIN 
    dbo.InventTableModule ITM 
    ON IT.ItemId = ITM.ItemId 
    AND IT.DataAreaId = ITM.DataAreaId
GROUP BY 
    IT.ItemId, IT.ItemName

This version is closer to the original query I wrote some time ago:

-- Item prices via group by
SELECT ITD.DataAreaId 
, ITD.ItemId  
, ITD.InventPrice
, ITD.InventUnitId
, ITD.PurchPrice
, ITD.PurchUnitId
, ITD.SalesPrice
, ITD.SalesUnitId
FROM dbo.InventTable ITM
     LEFT JOIN (-- price details
     SELECT ITD.ItemId
     , ITD.DataAreaId 
     , Max(CASE ITD.ModuleType WHEN 0 THEN ITD.Price END) InventPrice
     , Max(CASE ITD.ModuleType WHEN 0 THEN ITD.UnitId END) InventUnitId
     , Max(CASE ITD.ModuleType WHEN 1 THEN ITD.Price END) PurchPrice
     , Max(CASE ITD.ModuleType WHEN 1 THEN ITD.UnitId END) PurchUnitId
     , Max(CASE ITD.ModuleType WHEN 2 THEN ITD.Price END) SalesPrice
     , Max(CASE ITD.ModuleType WHEN 2 THEN ITD.UnitId END) SalesUnitId
     FROM dbo.InventTableModule ITD
     GROUP BY ITD.ItemId
     , ITD.DataAreaId 
    ) ITD
       ON ITD.ItemId = ITM.ItemId
      AND ITD.DataAreaId = ITM.DataAreaId
WHERE ITD.DataAreaID = 'XYZ'
ORDER BY ITD.ItemId

Prompt:
Rewrite the query to provide the other existing individual prices via a grouping in separate fields.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , Sales.Price AS [Sales Price]
    , Purchase.Price AS [Purchase Price]
    , Cost.Price AS [Cost Price]
FROM 
    dbo.InventTable IT
LEFT JOIN 
    dbo.InventTableModule Sales 
    ON IT.ItemId = Sales.ItemId 
    AND IT.DataAreaId = Sales.DataAreaId 
    AND Sales.ModuleType = 0 -- Assuming 0 represents the sales module
LEFT JOIN 
    dbo.InventTableModule Purchase 
    ON IT.ItemId = Purchase.ItemId 
    AND IT.DataAreaId = Purchase.DataAreaId 
    AND Purchase.ModuleType = 1 -- Assuming 1 represents the purchase module
LEFT JOIN 
    dbo.InventTableModule Cost 
    ON IT.ItemId = Cost.ItemId 
    AND IT.DataAreaId = Cost.DataAreaId 
    AND Cost.ModuleType = 2 -- Assuming 2 represents the cost module

The structure of the query looks promising, though, as in the previous queries, Copilot got the uses for ModuleType wrong (see below)! And here's the query I wrote previously:

-- Item Prices without grouping
SELECT ITM.DataAreaId 
, ITM.ItemId 
, ILP.UnitId InventUnitId 
, ILP.Price InventPrice 
, IPP.UnitId PurchaseUnitId
, IPP.Price PurchasePrice
, ISP.UnitId SalesUnitId
, ISP.Price SalesPrice
FROM dbo.InventTable ITM
      LEFT JOIN dbo.InventTableModule ILP
        ON ITM.ItemId = ILP.ItemId
       AND ITM.DataAreaId = ILP.DataAreaId
       AND ILP.ModuleType = 0 -- Warehouse
      LEFT JOIN dbo.InventTableModule IPP
        ON ITM.ItemId = IPP.ItemId
       AND ITM.DataAreaId = IPP.DataAreaId 
       AND IPP.ModuleType = 1 -- Purchases
      LEFT JOIN dbo.InventTableModule ISP
        ON ITM.ItemId = ISP.ItemId
       AND ITM.DataAreaId = ISP.DataAreaId 
       AND ISP.ModuleType = 2 -- Sales	
WHERE ITM.DataAreaId = 'XYZ'

Probably the Copilot for Dynamics 365 Finance and Operations [1] works much better than the one from M365. Unfortunately, I don't have access to it yet! Also, if I invested more time in the prompts, the results would be closer to the queries I wrote. It also depends on the goal(s) considered - building a skeleton on which to build the logic, respectively generating the final query via the prompts. Probably the 80-20 rule applies here as well.

Frankly, for a person not knowing the D365 data model, the generated queries are a good starting point (at least for searching for more information on the web), as long as Copilot gets the prompts right. It will also be interesting to see how business rules related to specific business processes (including customizations) will be handled. The future looks bright (at least for those still having a job in the field)!

Happy coding!


References:
[1] Microsoft Learn (2024) Microsoft Learn: Overview of Copilot capabilities in finance and operations apps [link]

25 December 2024

💎🏭SQL Reloaded: Number of Records IV (via sys.partitions DMV)

To get the exact number of records in a table one can use the COUNT (see post) or the more recent COUNT_BIG function [2], though for big tables this can be an inefficient operation for the database engine:

-- number of records via COUNT
SELECT count(*) row_count
FROM SalesLT.Product
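
For tables whose counts might exceed the int range (about 2.1 billion records), COUNT_BIG returns a bigint instead [2]:

-- number of records via COUNT_BIG
SELECT count_big(*) row_count
FROM SalesLT.Product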

Moreover, sometimes the operation needs to be repeated for a number of tables, e.g. dropdown tables in Dynamics 365 for Finance and Operations (D365 F&O). Writing the query as a UNION allows exporting the data as a single table and doing comparisons (e.g. in Excel). The same approach can also be used when multiple columns are used for grouping, though one must account for the additional columns in the other subqueries. However, the more tables are involved, the more difficult it becomes to maintain the query over time.
 
-- number of records via COUNT for multiple tables
SELECT 'SalesLT.Product' table_name
, count(*) row_count
FROM SalesLT.Product
UNION ALL
SELECT 'SalesLT.ProductDescription' table_name
, count(*) row_count
FROM SalesLT.ProductDescription
UNION ALL
SELECT 'SalesLT.ProductModel' table_name
, count(*) row_count
FROM SalesLT.ProductModel

There are many scenarios in which an approximation of the number of records available in a table is enough, while doing a record count might prove to be too expensive. For a quick-and-dirty solution one can use the sys.partitions DMV instead:

-- number of records via DMV for single object
SELECT object_id
, OBJECT_NAME(object_id) object_name
, OBJECT_SCHEMA_NAME(object_id) schema_name
, SUM(Rows) AS row_count
, data_compression_desc AS compression_type
, COUNT(*) partitions_count
FROM sys.partitions 
WHERE index_id < 2 --ignore the partitions from the non-clustered index if any
  AND OBJECT_ID('SalesLT.Product') = object_id
GROUP BY object_id
, data_compression_desc
ORDER BY row_count DESC;

The query is based on the sys.partitions table [1], which contains a row for each partition of all the tables and most types of indexes in the database. The documentation mentions that "rows" indicates the approximate number of rows in the considered partition.

Alternatively, one can bring more tables into the query to extend its range of applicability. 

-- number of records via DMVs
SELECT S.name + '.' + T.name SearchName
, S.Name SchemaName
, T.name TableName
, P.row_count
, P.compression_type
, P.partitions_count
FROM sys.tables T
     LEFT JOIN (
        SELECT object_id
        , SUM(Rows) AS row_count
        , data_compression_desc AS compression_type
        , COUNT(*) partitions_count
        FROM sys.partitions 
        WHERE index_id < 2 --ignore the partitions from the non-clustered index if any
        --AND OBJECT_ID('SalesLT.Product') = object_id
        GROUP BY object_id
        , data_compression_desc
     ) P
    ON T.object_id = P.object_id
     JOIN sys.schemas S
       ON S.schema_id = T.schema_id
WHERE S.Name = 'SalesLT'
  AND T.Name LIKE 'Product%'
ORDER BY row_count DESC;

The data can be exported regularly to give an idea of how tables' cardinality changes over time. One can find this useful as part of the loading process in data warehouses or other solutions (e.g. data migrations).
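
One way to do that is to persist a dated snapshot of the counts in a log table (the table below is only illustrative):

-- persist a dated snapshot of the record counts for trend analysis
CREATE TABLE dbo.TableCardinalityLog (
  snapshot_date date NOT NULL
, schema_name sysname NOT NULL
, table_name sysname NOT NULL
, row_count bigint);

INSERT INTO dbo.TableCardinalityLog
SELECT Cast(GetDate() as date) snapshot_date
, OBJECT_SCHEMA_NAME(object_id) schema_name
, OBJECT_NAME(object_id) table_name
, SUM(Rows) row_count
FROM sys.partitions
WHERE index_id < 2 --ignore the partitions from the non-clustered index if any
GROUP BY object_id;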

By using an INNER JOIN instead of the LEFT JOIN (or by adding a row_count > 0 constraint) one can retrieve only the tables that have records.

One should consider only the tables in scope, and possibly remove the records associated with system objects (e.g. sys or information_schema, as applicable).

 -- constraints to be added in the WHERE clause to remove the records related to system objects
 AND OBJECT_NAME(object_id) NOT LIKE 'sys%'
 AND OBJECT_NAME(object_id) NOT LIKE 'queue_%' 
 AND OBJECT_NAME(object_id) NOT LIKE 'filestream_tombstone%' 

There are also scenarios in which the count is needed only for a subset of the data. It's the case of D365 F&O, in which the number of records is needed by DataAreaId (aka company) or another field. A solution can be built using the sp_MSForEachTable stored procedure (see the last query from this post) and a cursor, as sketched below.
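
Here's a minimal sketch based on sp_MSForEachTable; the procedure is undocumented and unsupported, so it should be tested carefully before use. The COL_LENGTH check skips the tables that don't have a DataAreaId column:

-- number of records by company via sp_MSForEachTable (a sketch)
CREATE TABLE #counts (
  table_name sysname
, DataAreaId nvarchar(4)
, row_count bigint);

EXEC sys.sp_MSforeachtable @command1 = N'
IF COL_LENGTH(''?'', ''DataAreaId'') IS NOT NULL
   INSERT INTO #counts
   SELECT ''?'' table_name
   , DataAreaId
   , count_big(*) row_count
   FROM ?
   GROUP BY DataAreaId;';

SELECT *
FROM #counts
ORDER BY table_name
, DataAreaId;

DROP TABLE #counts;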

Notes:
1) Unfortunately, in Microsoft Fabric warehouses sys.partitions.rows is 0 for all user tables, and currently the sp_MSForEachTable stored procedure can't be used to retrieve the number of records for all tables either. However, one can create an old-fashioned cursor for iterating through the collection of tables in scope (see the sketch after these notes).
2) The code used in this post is also available in the GitHub repository.
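
Following the first note, here's a minimal sketch of such a cursor, which iterates over the tables in scope and counts the records via dynamic SQL (counting remains an expensive operation, so the scope should be restricted accordingly):

-- number of records via a cursor and dynamic SQL
DECLARE @schema_name sysname
, @table_name sysname
, @sql nvarchar(max);

DECLARE tab_cursor CURSOR FOR
SELECT S.name
, T.name
FROM sys.tables T
     JOIN sys.schemas S
       ON S.schema_id = T.schema_id
WHERE S.name = 'SalesLT';

OPEN tab_cursor;
FETCH NEXT FROM tab_cursor INTO @schema_name, @table_name;

WHILE @@FETCH_STATUS = 0
BEGIN
   SET @sql = N'SELECT ''' + @schema_name + N'.' + @table_name + N''' table_name'
            + N', count_big(*) row_count FROM ' + QuoteName(@schema_name) + N'.' + QuoteName(@table_name);
   EXEC sp_executesql @sql;

   FETCH NEXT FROM tab_cursor INTO @schema_name, @table_name;
END

CLOSE tab_cursor;
DEALLOCATE tab_cursor;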

Happy coding and Merry Christmas!


Resources:
[1] Microsoft Learn (2024) sys.partitions (Transact-SQL) [link]
[2] Microsoft Learn (2024) COUNT_BIG (Transact-SQL) [link]

04 March 2024

🧭🏭Business Intelligence: Microsoft Fabric (Part II: Domains and the Data Mesh I - The Challenge of Structure Matching)

Business Intelligence Series

The holy grail of building a Data Analytics infrastructure seems to be nowadays the creation of a data mesh, a decentralized data architecture that organizes data by specific business domains. This endeavor proves to be difficult to achieve given the various challenges faced: data integration, data ownership, data product creation and ownership, enablement of data citizens, as well as enforcing security and governance in a federated manner.

Microsoft Fabric promises to facilitate the creation of data meshes with the help of domains and subdomains by providing built-in security, administration, and governance features associated with them. A domain is a way of logically grouping together all the data in an organization that is relevant to a particular area or field. A subdomain is a way of fine-tuning the logical grouping of the data.

Business domains & their entities

At a high level, the challenge of building a data mesh is how to match or aggregate structures. On one side is the high-level structure of the data mesh, while on the other side is the structure of the business data entities. The data entities can be grouped within a taxonomy with multiple levels that expands to the departments. That's why it seems somehow natural to consider the departments as the top-most domains of the data mesh. The issue is that if the segmentation starts from a high level, it becomes inflexible in modeling. Moreover, one has only domains and subdomains, and thus a 2-level structure to model the main aspects of the data mesh.

Some organizations allow unrestricted access to the data belonging to a given department, while others break down the access to a more granular level. There are also organizations that don't restrict the access at all, though this may change later. Besides permissions and a way of grouping together the entities, what value does setting the domains as departments bring?

Therefore, I'm not convinced about using an organization's departmental structure as domains, especially when such a structure may change, and this would imply a full range of further changes. Moreover, such a structure doesn't reflect the span of processes or how permissions are assigned for the various roles, which are better reflected in how information systems are structured. Most probably the solution needs to accommodate both perspectives and lie somewhere in the middle.

Take for example the internal structure of the modules from Dynamics 365 (D365). The Finance area is broken down into Accounts Payable, Accounts Receivable, Fixed Assets, General Ledger, etc. In some organizations the departments reflect this delimitation to some degree, while in others they are just associated with finance-related roles. Moreover, the permissions are more granular, reflecting the data entities the users work with.

Conversely, SCM extends into Finance, as Purchase orders, Sales orders and other business documents are the starting or intermediary points of processes that span modules. Similarly, there are processes that start in CRM or other systems. The span of processes seems more appropriate for structuring the data mesh, though the overlap of the systems with the roles involved in the processes, and the free definition of process boundaries, can overcomplicate the whole design.

It makes sense to define the domains at a level that resembles the structure of the modules available in D365, while the macro data entities represent the subdomains. The subdomains would then cover master as well as transactional data entities from the perspective of the domains, though there will be entities that need to be shared between multiple domains. Such a structure has fewer chances to change over time, allowing more flexibility and smaller areas of focus, and is thus easier to design, develop, test, deploy and maintain.


10 January 2024

📦Data Migrations (DM): Conceptualization (Part VIII: Create a Data Migration Strategy for Dynamics 365)

Data Migration Series

Microsoft made available a module on creating a Data Migration strategy (DMS) for Dynamics 365 solutions [1] and provides a template for the DMS workshop on GitHub. Despite the all-encompassing title, the module focuses mainly on the latter. Here are several points to consider in addition.

The workshop allows gathering the most important aspects related to the DM; however, it's useful to make the strategy explicit as part of a document and share it with the team. The DMS document should follow the general elaboration of a strategy by providing information about the current and targeted state, goals and objectives, assumptions, dependencies, risks and opportunities, the architecture (including alternatives that need to be explored), direct and interfacing processes, the roles involved and their responsibilities, communication paths (including escalation), etc.

Even if the template discusses general aspects, the treatment of best practices is minimal (see some technical and architectural aspects).

The DMS should describe when possible the high-level architecture planned for DM. The architecture section can be updated as the project progresses. 

One can plan for 3-4 dry-runs plus the UAT, though further iterations might be needed depending on the data quality, respectively the complexity of the logic needed. Each dry-run will have its own scope (in terms of entities and number of records) and objectives (in terms of passed tests), while the UAT attempts to mimic the same conditions as the DM for Go-Live.

Several iterations are usually needed to identify the entities in scope. Moreover, the scope will suffer several changes as new entities and migration areas are identified. There are high chances that some data entities will be forgotten or postponed until after Go-Live. Upon case, the business might approach a phased implementation, each phase with its own DM.

It's useful to have a list of the entities available in the organization and map them to the entities available in the target system(s). This allows to identify the gaps on the DM side. There will be entities for which data need to be provided, entities for which there's no counterpart in the target system, respectively entities with different levels of detail.

An important step of the process is defining the business rules that apply - what subset of data is migrated (e.g. active referenced master data, open transactions, etc.). The definitions will suffer further changes as business perception about what's needed changes.

As they'll go through multiple changes, it's useful to have such lists in Excel or similar repositories and reference them in the strategy and further documents. 

The strategy can also target the post-Go-Live migration activities, though the process and architecture might be slightly different, even if many of the requirements still apply. One can devise the further aspects as part of the DMS or consider them as part of the Data Management strategy and operations.

Despite the considerable overlap of the DM with Data Management, at least until now the DM is not considered part of the latter. Big organizations with periodic DMs might need to consider a broader strategy, and then address individually the deviations from each project.

The DM is usually considered as a subproject of the implementation and should be managed as such. Depending on its scope and characteristics (principles, acceptance criteria) the DM might use a different methodology and timeline than the main project. Defining project's goals/objectives, outcomes, stakeholders, timeline, phases, and milestones, respectively the high-level assumptions, dependencies, risks, opportunities and constraints is Project Management 101.

The DM is critical for an implementation's success. It would be useful to describe the architectural alternatives, as well to provide a series of architectural recommendations.


Resources:
[1] Microsoft Learn (2023) Create a data migration strategy for Dynamics 365 solutions [link]

08 January 2024

💫🗒️ERP Systems: Microsoft Dynamics 365's Fixed Assets [Note]

Disclaimer: This is a work in progress intended to consolidate information from various sources.

[Dynamics 365] Fixed Assets (aka FA)

  • {def} assets that a company owns and uses in the daily operations of the company and are not intended for resale to customers [1]
    • ⇐ the same asset might be classified as a FA in one company and as a current asset or inventory item in another [1]
  • represent permanent value and not just expenditures in the year of acquisition [1]
  • typically depreciated, or expensed, over their useful life
    • other adjustments might also be necessary
    • their useful lives span multiple years [1]
  • treated as balance sheet transactions in the year that they are acquired, and are posted as an asset to a balance sheet account [1]
  • {action} review asset value in the balance sheet (net book value) 
    • should be reviewed at least one time each year  [1]
      • it can be reviewed monthly, quarterly, semi-annually, or annually [1]
    • an adjustment of the asset value in the balance sheet (write-down or write-up) might be necessary  [1]
      • the adjustments are usually caused by some extraordinary occurrences in the market that affect the price if the company were to reacquire the asset [1]
      • ⇐ accounting principles in some countries or regions prohibit the write-up of assets [1]
  • {action} remove asset from the accounting books
    • when a company no longer has use for the asset [1]
      • ⇐ because it is either being sold or scrapped [1]
    • the original acquisition price and accumulated depreciation of the asset are reversed, and any surplus or loss from the disposal is posted to the profit and loss statement [1]
  • {action} dispose asset
    • the fixed asset transactions that were previously posted are reversed out of the original accounts [1]
    • the net amounts are then moved to the appropriate account for gain and loss for asset disposal [1]
  • Fixed asset groups 
    • allow specifying default attributes for every FA that is assigned to a group [1]
      • include the service life and the depreciation convention.
      • FAs are assigned to a group when they are created [1]
    • the only required field when creating a FA [1]
  • Books 
    • track the financial value of a fixed asset over time by using the depreciation configuration that is defined in the depreciation profile
      • ⇐ each book tracks an independent financial lifecycle of an asset [1]
    • assigned to fixed asset groups
      • {default} the books assigned to the FA group are then assigned to the FA [1]
    • books that are configured to post to the GL are associated with a posting profile
      • GL accounts are defined for each book in the posting profile and are used when fixed asset transactions are posted [1]
      • ⇐ the configuration is the default setting because it's typically used for corporate financial reporting [1]
    • books that don't post to the GL post only to the Fixed asset subledger and are typically used for tax reporting purposes [1]
      • this approach allows the flexibility to delete historical transactions for the asset book because they haven't been committed to GL [1]
      • {field} Post to general ledger
        • {No} disables posting to the general ledger for the book
          • sets the Posting layer field automatically to None
    • each book that is attached to a FA is set up for a particular posting layer that has an overall depreciation objective [1] 
    • derived books
      • setup in addition to the standard books [1]
      • simplify the posting of FA book transactions that are planned for regular intervals
        • one book is chosen as the primary book, which is usually the one used for accounting depreciation [1]
          • then attach to it other books that are set up to post transactions in the same intervals as the primary book [1]
            • ⇐ tax depreciation books are often set up as derived books [1]
        • allow posting transactions to different posting layers at the same time [1]
        • acquisitions, acquisition adjustments and disposals can be set up with it
          • use acquisition when the book and the derived book should be applied to the FA from the time of the fixed asset acquisition [1]
          • if the primary book and the derived books have the same intervals regarding sale or disposal, all FA transaction types are available for the setup of a derived book [1]
          • depreciation posted in the derived book will be the same amount as was posted for the primary book [1]
            • if the depreciation methods are different between the books, one should not generate depreciation transactions by using the derived process [1]
              • ⇐ the books can then be attached to specific fixed assets [1]
      • the specified derived transactions are posted against the derived books as an exact copy of the primary transaction [1]
        • ⇒ derived transactions are typically set up for acquisitions and disposals, not for depreciation transactions [1]
      • when posting transactions for a book that contains derived books, the derived book transactions are posted automatically in journals, purchase orders, or free text invoices [1]
      • when preparing the primary book transactions in the FA journal, one can view and modify the amounts of the derived transactions before posting them [1]
      • certain accounts are updated only once by postings of the primary book [1]
        • e.g. sales tax and customer or vendor accounts
      • derived book transactions are posted to the accounts that have been defined for the derived book in the FA posting profiles page [1]
      • {exception} books that are set up to post transactions at intervals other than the primary book intervals must be attached to the FA as separate books and not as derived books [1]
  • depreciation
    • an entry that expenses the part of the asset's original purchase price that was used during the year [1]
    • FAs are depreciated in different ways for different purposes
      • depreciation for tax purposes 
        • calculated by using current tax rules to achieve the highest possible depreciation before taxes [1]
      • depreciation for reporting purposes
        • calculated according to accounting laws and standards [1]
      • the various kinds of depreciation are calculated and recorded separately in the posting layers [1]
  • depreciation methods
    • straight line
      • computed by taking the cost of the acquisition and dividing it by the expected service life of the asset [1] (see the sketch at the end of these notes)
        • the rules are usually defined in the local legislation [1]
  • depreciation profiles 
    • allow configuring via rules how the value of an asset is depreciated over time
      • one needs to define the method of depreciation, the depreciation year (calendar year or fiscal year), and the frequency of depreciation [1]
    • a primary depreciation profile is assigned to every book
      • ⇐ books also have an alternative (aka switchover depreciation) profile, if this type of profile is applicable [1]
        • the depreciation proposal will switch to this profile when the alternative profile calculates a depreciation amount that is equal to or greater than the default depreciation profile [1]
      • ⇐ enable the Calculate depreciation option to automatically include the FA book in depreciation runs [1]
    • usually refers to tangible assets, but intangible assets can also use depreciation profiles [1]
    • financial assets are considered non-depreciable [1]
      • e.g. shares of stock or bonds
    • extraordinary depreciation profile
      • used for additional depreciation of an asset in unusual circumstances [1]
        • e.g. record depreciation that results from a natural disaster
    • depreciation calculation methods:
      • Straight line service life
      • Reducing balance
      • Manual
      • Factor
      • Consumption
      • Straight line life remaining
      • 200% reducing balance
      • 175% reducing balance
      • 150% reducing balance
      • 125% reducing balance
      • ⇐ most companies use one or more of the straight-line methods, one or more of the reducing balance methods, or the manual method [1]
      • ⇐ the rationale for all methods is to allocate or accrue the depreciable value of the asset into accounting periods [1]
        • the depreciable value of the asset is the acquisition price reduced by a scrap value, if any
      • ⇐ many depreciation profiles with the same depreciation method can be created for different calculations [1]
  • posting profile 
    • must be defined for each book that will post to GL, but it can also be defined at a more detailed level [1]
      • e.g. can be defined for the combination of a book and a FA group, or even for an individual FA book [1]
      • ⇐ set up a posting profile to post FA transactions to the general ledger
    • for each posting profile, one must select the relevant transaction type and book, and then designate the ledger accounts [1]
  • {option} Create depreciation adjustments with basis adjustments 
    • {selected} depreciation adjustments will be automatically created when the value of the asset is updated [1]
    • {not selected} the updated asset value will only affect depreciation calculations going forward [1]
    • {not enabled} if the option isn't enabled for an asset, the depreciation proposal skips the asset [1]
  • {optional} define special depreciation allowances, or bonus depreciation, for a specific combination of a fixed asset group and a book [1]
    • assign a priority to the special depreciation allowance to specify the order in which allowances are calculated when multiple allowances are assigned to a book [1]
  • Journal names 
    • create the journal names that should be used with the FA journal.
      • set the Journal type field to 'Post fixed assets'
      • set the Voucher series field so that the journal names are used for the FA journal [1]
      • FA journals should not use the One voucher number only setting
        • ⇐ because a unique voucher number is required for several automated processes, such as transfers and splits [1]
    • each journal used for depreciation is defined by its journal name for only one posting layer [1]
      • the posting layer in the journal can't be changed [1]
        • the restriction helps guarantee that transactions for each posting layer are kept separate [1]
      • at least one journal name must be created for each posting layer [1]
        • for books that don't post to GL create a journal where the posting layer is set to None [1]
  • {parameter} Capitalization threshold
    • determines the assets that are depreciated
    • if a purchase line is selected as a fixed asset but doesn't meet the specified capitalization threshold, a fixed asset is still created or updated, but the Calculate prorated depreciation option is set to No
      • ⇒ the asset won't be automatically depreciated as part of the depreciation proposals
  • {option} Automatically create depreciation adjustment amounts with disposal
    • {Yes}, the asset depreciation is automatically adjusted, based on the depreciation settings at the time of asset disposal [1]
  • [Purchase orders] 
    • {option} Allow asset acquisition from Purchasing
      • {Yes} asset acquisition occurs when the invoice is posted [1]
      • {No} the FA can be put on a PO and invoice, but the acquisition won't be posted [1]
        • the posting must be done as a separate step, from the FA journal
    • {option} Create asset during product receipt or invoice posting 
      • {enabled} a new asset is created while posting
        • ⇒ the asset doesn't have to be set up as a fixed asset before the transaction
  • [Purchase requisitions] {option} Check for fixed assets creation during line entry
  • Fixed asset number sequence 
    • can be overridden by the FA group number sequence if it has been specified [1]
  • depreciation conventions for groups of assets that are part of the same book
    • {action} updating the depreciation convention
      • delete all depreciation transactions that exist for those assets [1]
      • delete all transactions for depreciation adjustments, transactions for bonus depreciation, and transactions for extraordinary depreciation for those assets [1]
    • {action} update the depreciation convention for assets that have already been disposed
      • delete first the existing disposal transactions [1]
      • delete all transactions that were generated because of the disposal process [1]
    • {action} after updating the depreciation convention for assets
      • process depreciation and extraordinary depreciation for each asset [1]
      • make manual depreciation adjustments, if any adjustments are required [1]
  • [General Ledger] main accounts 
    • {default} the ledger accounts that are defined are used for your FA transactions [1]
    • {recommendation} define the ledger accounts that are used during the disposal processes, both disposal sales and disposal scraps [1]
    • {recommendation} to help guarantee that transactions are correctly reversed, set up accounts for each type of transaction that you use in your business [1]
      • the main account should be the original account set on the posting profile for the transaction type, and the offset account should be the gain and loss for disposal account [1]
      • {exception} for the net book value, the main account and the offset account should be set to the gain and loss for disposal account [1]
      • various combinations of main accounts can be created depending on the level of detail needed in GL [1]
    • can be based on transaction types, books, and other main accounts [1]
    • the value of all FAs is typically summarized in multiple main accounts that are required for financial reporting [1]
    • each time a transaction is posted for a FA, the appropriate main accounts are updated
      • ⇐ the main accounts always show the updated value [1]
    • the main accounts that are used for posting are determined by the options for ledger integration that are specified for the item model group [1]
      • the main accounts that are used vary, depending on whether an asset is assigned to the purchase order line [1]
      • the accounts are derived from the posting profile for each item group [1]
  • [Inventory management]
    • inventory items can be transferred to FAs either as an acquisition or as part of an acquisition [1]
      • an inventory item becomes a FA acquisition through 
        • inventory journal
          • one can enter the acquisition of FA that the LE has produced or constructed for itself [1]
        • PO
          • one PO line creates one fixed asset, regardless of the quantity [1]
          • items can be set up for both resale and internal use [1]
            • using specific receipt and issue accounts on item groups for FAs, one can use the same inventory item both for internal purchases and as stock for resale [1]
        • acquisition proposal
        • ⇐ with it a FA book acquisition transaction is created [1]
          • if a book acquisition includes a derived book, the derived book acquisition transaction is also created [1]
    • the effect the acquisition of fixed assets has on inventory depends on LE's setup [1]
      • posting rules control the decrease in inventory when an acquisition is posted [1]
        • the inventory isn't always decreased when posting invoices related to FA [1]
    • FAs for internal use must have an account type of 
      • 'Fixed asset receipt'
        • used to track FA's receipt
        • when posting a Vendor invoice, use the FA receipt account if any of the following conditions hold:
          • the invoice line contains an existing FA for internal purposes [1]
          • the New fixed asset? check box is selected for the product receipt line that is posted [1]
          • the 'Create a new fixed asset' check box is selected for the vendor invoice line [1]
        • typically, this account is an expense account
      • 'Fixed asset issue' 
        • used to track the issuing of the FA to the recipient [1]
        • when an asset is acquired by using a PO, the FA issue account offsets the FA debit account [1]
        • the asset acquisition can be posted either when posting the Vendor invoice or when posting the asset acquisition in the FA journal, possibly by using an acquisition proposal [1]
      • ⇐ setup for either an item group or an individual item by using the Purchase order tab on the Item group or Posting page [1]
    • if an inventory reservation exists when product receipts are posted, one can’t assign or create a FA from the line [1]
    • the accounts that FA transactions are posted to depend on two factors:
      • whether the assets are purchased or constructed by the LE [1]
      • the transaction type of the asset
        • connects the inventory transaction to the posting profile in FAs [1]
        • because the posting profile in FAs defines which accounts are updated, the selection of a transaction type for a FA is also, indirectly, the selection of the main accounts that the transaction is posted to [1]
        • for both constructed and purchased fixed assets, the transaction type is typically Acquisition or Acquisition adjustment [1]
  • [Accounts receivable]
    • the integration uses posting profiles that are set up in FAs
      • activated when a FA, book, and FA transaction type is selected for a Customer invoice before the customer invoice is posted [1]
      • selling a FA must be done via Free text invoices
        • ⇐ because FAs aren’t part of Inventory management [1]
  • [Accounts payable]
    • typically, FAs are acquired from external vendors
  • asset acquisitions are posted either
    • when posting the Vendor invoices
      • FA accounts are updated whenever a Vendor invoice for a FA acquisition is posted [1]
      • the transaction is posted according to the posting profiles that are set up in FAs for the various FA transaction types [1]
    • from FAs
  • [Project management and accounting]
    • a project can be associated with a FA that is affected by the project
      • each phase, task, or subproject can be associated to a different asset [1]
      • one FA can be associated with each project record [1]
      • the association is created via the Fixed asset number field on the Projects page [1]
        • typically, applies to projects related to work, maintenance, or improvements for the FA [1]
      • the project type must be either Internal or Cost project
      • clear the Fixed asset number field on the Projects page to delete the association [1]
    • when the project is completed, a write-up adjustment for the asset isn’t created automatically [1]
      • ⇒ must be created manually [1] 
    • one can designate a FA created or manufactured as part of an estimate project [1] 
      • at the end of an estimate project, one can automatically post a FA acquisition transaction [1]
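
To illustrate the straight line service life method mentioned above, here's a minimal arithmetic sketch; the values are made up, and the actual calculation depends on the book's depreciation profile, conventions and local legislation:

-- straight line service life: yearly depreciation = depreciable value / service life
DECLARE @AcquisitionPrice money = 11000;
DECLARE @ScrapValue money = 1000;
DECLARE @ServiceLifeYears int = 5;

SELECT (@AcquisitionPrice - @ScrapValue) / @ServiceLifeYears YearlyDepreciation; -- 2000.00 per year over 5 years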

Resources:
[1] Microsoft Learn (2023) Configure fixed assets management in Dynamics 365 Finance [link]

Acronyms:
D365 F&O - Dynamics 365 for Finance and Operations
FA - Fixed Asset
GL - General Ledger
LE - Legal Entity
PO - Purchase Order
