
25 April 2025

💫🗒️ERP Systems: Microsoft Dynamics 365's Business Process Catalog (BPC) [Notes]

Disclaimer: This is work in progress intended to consolidate information from the various sources and not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

Last updated: 25-Apr-2025

Business Process Catalog - End-to-End Scenarios

[Dynamics 365] Business Process Catalog (BPC)

  • {def} list of end-to-end processes that are commonly used to manage or support work within an organization [1]
    • agnostic catalog of business processes contained within the entire D365 solution space [3]
      • {benefit} efficiency and time savings [3]
      • {benefit} best practices [3]
      • {benefit} reduced risk [3]
      • {benefit} technology alignment [3]
      • {benefit} scalability [3]
      • {benefit} cross-industry applicability [3]
    • stored in an Excel workbook
      • used to organize and prioritize the work on the business process documentation [1]
      • {recommendation} check the latest versions (see [R1])
    • assigns unique IDs to 
      • {concept} end-to-end scenario
        • described in business terms 
          • not in terms of software technology
        • includes the high-level products and features that map to the process [3]
        • covers two or more business process areas
        • {purpose} map products and features to benefits that can be understood in business contexts [3]
      • {concept} business process areas
        • combination of business language and basic D365 terminology [3]
        • groups business processes for easier searching and navigation [1]
        • separated by major job functions or departments in an organization [1]
        • {purpose} map concepts to benefits that can be understood in business context [3]
        • more than 90 business process areas defined [1]
      • {concept} business processes
        • a series of structured activities and tasks that organizations use to achieve specific goals and objectives [3]
          • efficiency and productivity
          • consistency and quality
          • cost reduction
          • risk management
          • scalability
          • data-driven decision-making
        • a set of tasks in a sequence that is completed to achieve a specific objective [5]
          • define when each step is done in the implementation [5] [3]
          • define how many are needed [5] [3]
        • covers a wide range of structured, often sequenced, activities or tasks to achieve a predetermined organizational goal
        • can refer to the cumulative effects of all steps progressing toward a business goal
        • describes a function or process that D365 supports
          • more than 700 business processes identified
          • {goal} provide a single entry point with links to relevant product-specific content [1]
        • {concept} business process guide
          • provides documentation on the structure and patterns of the process along with guidance on how to use them in a process-oriented implementation [3]
          • based on a catalog of business processes supported by D365 [3]
        • {concept} process steps 
          • represented sequentially, top to bottom
            • can include hyperlinks to the product documentation [5] 
            • {recommendation} avoid back and forth in the steps as much as possible [5]
          • can be
            • forms used in D365 [5]
            • steps completed in LCS, PPAC, Azure or other Microsoft products [5]
            • steps that are done outside the system (incl. third-party system) [5]
            • steps that are done manually [5]
          • are not 
            • product documentation [5]
            • a list of each click to perform a task [5]
        • {concept} process states
          • include
            • project phase 
              • e.g. strategize, initialize, develop, prepare, operate
            • configuration 
              • e.g. base, foundation, optional
            • process type
              • e.g. configuration, operational
      • {concept} patterns
        • repeatable configurations that support a specific business process [1]
          • specific way of setting up D365 to achieve an objective [1]
          • address specific challenges in implementations and are based on a specific scenario or best practice [6]
          • the solution is embedded into the application [6]
          • includes high-level process steps [6]
        • include the most common use cases, scenarios, and industries [1]
        • {goal} provide a baseline for implementations
          • more than 2000 patterns, a number expected to grow significantly over time [1]
        • {activity} naming a new pattern
          • starts with a verb
          • describes a process
          • includes product names
          • indicates the industry
          • indicates AppSource products
      • {concept} reference architecture 
        • acts as a core architecture with a common solution that applies to many scenarios [6]
        • typically used for integrations to external solutions [6]
        • must include an architecture diagram [6]
    • {concept} process governance
      • {benefit} improved quality
      • {benefit} enhanced decision making
      • {benefit} agility and adaptability
      • {benefit} Sbd alignment
      • {goal} enhance efficiency 
      • {goal} ensure compliance 
      • {goal} facilitate accountability 
      • {concept} policy
      • {concept} procedure
      • {concept} control
    • {concept} scope definition
      • {recommendation} avoid replicating current processes without considering future needs [4]
        • {risk} replicating processes in the new system without re-evaluating and optimizing [4] 
        • {impact} missed opportunities for process improvement [4]
      • {recommendation} align processes with overarching business goals rather than the limitations of the current system [4]
    • {concept} guidance hub
      • a central landing spot for D365 guidance and tools
      • contains cross-application documentation
  • {purpose} provide considerations and best practices for implementation [6]
  • {purpose} provide technical information for implementation [6]
  • {purpose} provide link to product documentation to achieve the tasks in scope [6]
Previous Post <<||>> Next Post 

References:
[1] Microsoft Learn (2024) Dynamics 365: Overview of end-to-end scenarios and business processes in Dynamics 365 [link]
[2] Microsoft Dynamics 365 Community (2023) Business Process Guides - Business Process Guides [link]
[3] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 2 Introduction to Business Processes [link]
[4] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 3: Using the Business Process Catalog to Manage Project Scope and Estimation [link]
[5] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 4: Authoring Business Processes [link]
[6] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 5:  Authoring Business Processes Patterns and Use Cases [link]
[7] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance  - Part 6: Conducting Process-Centric Discovery [link]
[8] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance  - Part 7: Introduction to Process Governance [link]

Resources:
[R1] GitHub (2024) Business Process Catalog [link]
[R2] Microsoft Learn (2024) Dynamics 365 guidance documentation and other resources [link]
[R3] Dynamics 365 Blog (2025) Process, meet product: The business process catalog for Dynamics 365 [link]

Acronyms:
3T - Tools, Techniques, Tips
ADO - Azure DevOps
BPC - Business Process Catalog
D365 - Dynamics 365
LCS - Lifecycle Services
PPAC - Power Platform admin center
RFI - Request for Information
RFP - Request for Proposal

23 March 2025

💫🗒️ERP Systems: Microsoft Dynamics 365's Financial Tags [Notes]

Disclaimer: This is work in progress intended to consolidate information from the various sources and not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

Last updated: 23-Mar-2025

[Dynamics 365] Financial Tags

  • {def} user-defined metadata elements used to track additional information on accounting entries for analytics or process purposes
    • provide an additional layer of metadata
    • {objective} eliminate the need to use document numbers, descriptions, or financial dimensions [1]
      • stored on the accounting entries that are created for the transactions [1]
    • {benefit} improved accuracy 
      • ensure each transaction is linked with the correct accounting and auditing elements, enhancing the accuracy in financial reporting and compliance [8]
    • {benefit} streamlined processes 
      • by automating the categorization of financial transactions, financial tags enable a more efficient invoicing process [8]
    • {benefit} better financial tracking 
      • allow for granular tracking of expenses and revenues, enabling more detailed financial analysis [8]
    • shown as separate columns on voucher transactions and similar GL inquiry forms 
    • legal entity specific
    • can be shared by using the Shared data feature [3]
    • designed to support any amount of reuse
    • do not default from master data
      • {feature|planned} defaulting will be enabled through user-defined rules
    • similar to financial dimensions
      • an alternative to creating financial dimensions
        • ⇐ financial dimensions are structured (account structures, account rules, validation) 
        • ⇐ financial dimensions are designed for medium to high reuse 
      • the two are most likely mutually exclusive
      • every transaction that supports dimensions will eventually support financial tags 
    • unstructured 
      • no structure, no rules, no validation
    • require a delimiter between the tag values
      • defined via General ledger parameters >> Financial tags
    • can be deactivated but not deleted 
      • ⇐ helps ensure that the tag values remain available for reporting on posted general ledger entries
      • can easily be activated and deactivated at any time
    • the label of each financial tag can be changed at any time, even after transactions are posted
      • if transactions have been posted for a specific financial tag, the tag values don't change
    • tag values
      • are associated with an accounting entry
      • can be reused 
      • have header to line defaulting
      • are stored as simple text 
      • do not reference other data 
      • are not validated at any time, including during entry and posting
      • can be entered or edited at any time prior to posting 
      • can be changed at any time after posting 
        • by enabling "Allow edits to internal data on general ledger vouchers" feature
    • up to 20 financial tags can be defined
      • e.g. Customers, Vendors, Projects, PO numbers, Payment references
      • each can have up to 100 characters [1]
  • {type} text 
    • free text with no lookup 
  • {type} custom list
    • free text with lookup 
  • {type} list
    • predefined list of many common types of data with lookup 
      • list values are also not validated
  • supported by
    • general journals
    • customer and vendor payment journals, including entities 
  • {operation} editing
    • values can be entered or edited at any time prior to posting 
    • values can be changed at any time after posting 
      • by enabling "Allow edits to internal data on general ledger vouchers" feature
  • can be disabled at any time [1]
    • any values that were entered for financial tags on transactions will be maintained in the database [1]
      • values will no longer be visible on any transactions or in inquiries [1]
  • journals and transactions support for tags
    • [10.0.32] introduced
    • [10.0.37] [1]
      • general journal, including entities 
      • global general journal
      • allocation journal
      • fixed asset journal
      • all asset leasing journals
      • periodic journal
      • reporting currency adjustment journal
      • customer payment journal, including entities 
      • vendor payment journal, including entities 
      • invoice journal (vendor)
      • global invoice journal (vendor)
      • invoice register
      • SO documents 
        • Sales order, packing slip and customer invoice
        • {feature} "Enable financial tags for sales order invoicing"
      • voucher transactions and Transactions for [account] forms 
      • general journal account entry reporting entity 
      • ledger settlement (manual settlement)
    • [10.0.41|PP] PO documents
      • {feature} "Enable financial tags for purchase order invoicing"
  • {feature} [10.0.42] financial tag rules 
    • allow to enter default value or automatically populate values in financial tags [7]
    • {benefit} ensure consistency and efficiency in transaction tagging [7]
      • ⇐ essential for accurate financial tracking and reporting [7]
    • journals support [7]
      • general journal
      • global general journal
      • allocation journal
      • reporting currency adjustment journal
      • invoice journal (vendor)
    • {operation} Create a financial tag rule
      • via General ledger >> Chart of accounts >> Financial tags >> Financial tags >> New >>
    • {operation} Copy a financial tag rule within legal entity
      • copies a rule that is defined for one transaction entry point to another entry point in the same legal entity [7]
    • {operation} Copy a financial tag to other legal entity
      • copies rules to any legal entity where financial tags are defined and active. Select one or more rules to copy to another legal entity [7]
  • {feature} rule-based defaulting engine for financial tags 
    • e.g. default the vendor name to financial tag XX 
  • {feature} financial tag defaulting rules
  • {feature} evaluate storing financial tags directly on subledger data 
    • e.g. store financial tag values in the bank subledger to use with advanced bank reconciliation matching rules

References:
[1] Microsoft Learn (2025) Dynamics 365 Finance: Financial tags [link]
[2] Microsoft Learn (2025) Dynamics 365 Finance: Differences between financial tags and financial dimensions [link]
[3] Microsoft Learn (2022) Dynamics 365 Finance: Financial dimensions [link]
[4] Dynamics on Demand (2025) Financial Tags in Microsoft Dynamics 365 Finance | 10.0.32 [link]
[5] Ramit Paul (2025) Financial Tags in Microsoft Dynamics 365 Finance and Operations [link]
[6] Microsoft Learn (2025) Dynamics 365 Finance: Financial tag rule reference (preview) [link]
[7] Microsoft Learn (2025) Dynamics 365 Finance: Financial tag rules (preview) [link]
[8] Dynamics Global Edge IT Solutions (2024) Financial Tags For Purchase Order Invoicing In MS Dynamics365 F&O [link]

Resources:
[R1] Dynamics365lab (2024) Ep. 120:4 Exploring Financial Tags in Dynamics 365 F&O [link]
[R2] Nextone Consulting (2024) New Feature: Financial Tag Rules in Dynamics 365 SCM 10.0.42 [link]
[R3] Dynamics on Demand (2024) Financial Tags in Microsoft Dynamics 365 Finance | 10.0.32 [link]
[R4] Axcademy (2023) Is this the end to Financial dimensions in D365FO as we know them? [link]
[R5] Hitachi Solutions (2024) New Feature in Dynamics 365 Finance - Financial Tags [link]

Acronyms:
D365 F&O - Dynamics 365 for Finance and Operations
GL - General Ledger
GA - General Availability
LE - Legal Entity
PO - Purchase Order
PP - Public Preview
SO - Sales Order

15 March 2025

💫🗒️ERP Systems: Microsoft Dynamics 365's Business Performance Analytics (BPA) [notes]

Disclaimer: This is work in progress intended to consolidate information from the various sources and not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

Last updated: 15-Mar-2025

[Dynamics 365] Business Performance Analytics (BPA)

  • {def} centralized reporting hub within D365 F&O designed to streamline insights and help organizations make faster, data-driven decisions [3]
    • solution designed to transform organization's data into actionable insights [1]
    • provides an intuitive out-of-box data model along with familiar tools like Microsoft Excel and Power BI for self-service analytics [4]
      • data extracted from D365 is classified in BPA in the form of value chains
        • ⇐ a value chain is a group of related business processes [4]
  • {benefit} allows simplifying data insights by providing a unified view of business data across entities in near real time [4]
  • {benefit} allows streamlining financial and operations reporting to reduce cycle times [4]
  • {benefit} allows users of all technical abilities to quickly access and analyze data to facilitate data-driven decisions [4]
  • {benefit} provides auditors with direct access to financial data, making the audit process more efficient
  • {benefit} enables ease of use through familiar apps like Excel and Power BI, in addition to AI-driven insights and automation on a platform that is scalable and extendable [4]
  • {feature} extends into Microsoft Fabric
    • {benefit} provide a scalable, secure environment for handling large data sets and ensuring insights are always powered by the latest technology [3]
  • {feature} ETL process 
    • involves extracting data from the finance and operations database, transforming it, and loading it into Dataverse [4]
      • each of the entities required to generate the dimensional model for the value chains mentioned earlier is backed by underlying tables in the finance and operations database [4]
    • once installed in Dataverse, the virtual entities that are created pull the data into the managed data lake [4]
    • the data is then transformed to generate the dimensional model, which is pushed into the embedded Power BI workspace in the form of analytical tables [4]
    • BPA consumes this data from the Power BI workspace to render the Power BI reports [4]
    • this data can also be extended to Fabric if there is a need to consolidate data from multiple sources [4]
  • {feature} reports 
    • designed to provide a detailed overview of an organization's financial health [8]
    • further reports will be added to expand the coverage for the value chains [8]
    • out-of-box reports can't be modified
      • ⇐ users cannot rename, delete or edit these types of reports [8]
      • there’s the option to duplicate the base report and edit the version thus created [8]
    • can be shared with other users who have access to BPA 
      • ⇐ they can receive an in-app notification [8]
      • can be shared over email with another user by entering user’s email address [8] 
      • one can configure whether the recipient can edit or view the report [8]
    • {feature} allows creating a new Power BI or Excel report from scratch [8]
      • {option} start with a blank report or duplicate an existing report [8]
  • {feature} data refresh
    • automatic data refreshes currently run twice a day [4]
      • at 12:00 AM and 12:00 PM UTC
      • the volume of data is also constrained by the storage capacity of the A3 SKU for Power BI Embedded [1]
        • a future release may support additional data reporting capacity [1]
          • ⇐ so that larger data sets can be reported and analyzed [1]
      • the target is to have refreshes every hour or less [3]
    • the data volume is initially limited to about eight quarters of data [4]
    • extensibility will be supported with bring your own Fabric [4]
  • architecture
    • SaaS solution
      • {capability} immediate deployment 
        • businesses can start to analyze data and generate insights with minimal setup [1]
      • {capability} comprehensive reporting and dashboards
        • provides access to a wide range of preconfigured reports that cover multiple business functions [1]
      • {capability} near-real-time analytics 
        • future releases will offer more frequent data refreshes to enable near-real-time data analysis and reporting
      • {capability} predictive insights 
        • future releases will introduce predictive analytics capabilities that enable businesses to 
          • forecast trends
          • identify risks
          • seize opportunities [1]
      • {capability} user-friendly interface 
        • intuitive design ⇒ minimal training
          • fosters broader adoption 
          • enables a data-driven culture across the organization [1]
      • {capability} cost-effectiveness
        • available as part of D365 license
          • ⇒ provides advanced analytics without requiring significant investments in IT infrastructure [1]
    • DaaS solution
      • {capability} organizations can integrate its data models with their existing data warehousing infrastructure in Microsoft Fabric [1]
        • maximizes the value of existing data solutions [1]
        • positions businesses for future enhancements [1]
      • {capability} unified and scalable data models
        • customers can build custom models on top of a unified framework
          • ensures consistency and scalability across data sets [1]
      • {capability} future-proofing with automatic upgrades
        • data models integrate seamlessly with future D365 updates
          • reduces manual maintenance and ensures access to the latest features [1]
      • {capability} consistency and standardization
        • data models provide consistency and standardization across data sources
          • ensure high data quality and integrity [1]
      • {capability} advanced analytics and AI 
        • by customizing the data models, organizations can take advantage of advanced analytics and AI capabilities [1]
          • deeper insights without having to develop them from scratch [1]
      • {capability} enhanced data governance
        • unified data models support better data governance by providing standardized data definitions, relationships, and hierarchies [1]
          • ensure consistency and quality across the organization [1]
    • requires an integrated Power Platform environment [5]
      • must be integrated with the Microsoft Entra tenant [5]
    • uses shared Dataverse entitlements [1]
      • includes access to the data lake [1]
  • setup
    • dimensions
      • the selection of dimensions might affect the dimension groups that are created using these dimensions and the users who are assigned to them [7]
        • e.g. legal entity, business unit
    • dimension groups
      • users can select specific values for the legal entity, or add a range of values [7]
        • if an invalid combination of dimension values is selected, the dimension group will filter out all the records on the report [7]
      • {warning} assigning too many dimension groups to a user slows the load for that user [7]
    • roles
      • determine which reports the user can access [7]
  • security
    • secure data through role-based access control on top of the value chains [7]
    • the first user who signs into the app is assigned the BPA admin role [7]
      • allows a user to access the administrator section of the BPA [7]
        • where the security can be set up [7]
      • is automatically assigned 
        • Microsoft report viewer role 
        • the All Access Dimension group [7]
          • allows the admin to see the data in all the reports across all the dimensions [7]
    • {feature} dimension-based role-level security
      • ensures that users only see the data relevant to them based on their role
        •  confidently share reports without duplicating them
          • ⇐ data is automatically filtered by organization's security policies [3]
      • simple but powerful way to maintain control while providing access for teams that love working in Excel [3]
  • accessibility
    • can be accessed through either 
      • Power Platform
        • admins can access the BPA app through the Power Apps maker portal [6]
      • Dynamics 365
        • through the BPA preview shortcut in the homepage or the default dashboard [6]
        • for end users, the BPA preview shortcut is provided when they have certain duties associated with their role(s) [6]
  • licensing
    • included in D365 F&O license [4]
  • requirements
    • requires a Tier 2 environment and Dynamics 365 Finance version 10.0.38 or later [5]
  • {project} timeline
    • [2025 wave 1] backup and restore custom reports and analytics
      • {benefit} support better lifecycle management and empower customers to develop on sandbox instances before publishing to production [3]
    • 2025: available in all regions where F&O is available [3]
    • Oct-2024: GA

References:
[1] Microsoft Learn (2024) Dynamics 365 Finance: What is Business performance analytics? [link]
[2] Microsoft Learn (2025) Business performance analytics (BPA) with Dynamics 365 Finance [link]
[3] Dynamics 365 Finance - Business Performance Analytics 2025 Release Wave 1 Release Highlights [link]
[4] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 1 [link]
[5] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 2 [link]
[6] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 3 [link]
[7] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 4 [link]   
[8] Dynamics 365 Community (2024) Dynamics 365 Bites: Business Performance Analytics Part 5 [link]
[9] Microsoft Learn (2024) Dynamics 365: Business performance analytics introduction [link]

Acronyms:
AI - Artificial Intelligence
BPA - Business Performance Analytics
D365 F&O - Dynamics 365 for Finance and Operations
DaaS - Data-as-a-Service
ETL - Extract, Transform, Load
GA - General Availability
MF - Microsoft Fabric
PP - Public Preview
SaaS - Software-as-a-Service
SKU - Stock Keeping Unit
UTC - Coordinated Universal Time

01 March 2025

💎🤖💫SQL Reloaded: Copilot Stories (Part VI: Queries based on Dynamics 365 for Finance & Operations)

As part of ad-hoc queries, SSRS/Paginated Reports and Data Warehouse solutions, during the past 15 years I wrote hundreds of queries based on Dynamics AX, respectively on Dynamics 365 (D365) for Finance, Supply Chain, Production, Human Resources and a few other modules. So, given that the documentation for the D365 schema is available online, I've been wondering how easy it would be to write a basic query using M365 Copilot. A relatively simple example would be the join between the Products (InventTable) and Prices (InventTableModule) tables, quite a simple thing to do (the joins were discussed here).

Disclaimer: The below code was generated by Copilot via prompts used for demonstration purposes!

The first attempt proved to be far from the expectations:

Prompt:
Write a query that joins InventTable and InventTableModule to retrieve the Item Number, Description and the Item Prices from Dynamics 365 for Finance. Include tables' schema and rewrite the query by placing the comma first.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , ITM.Price AS [Item Price]
FROM 
    dbo.InventTable IT
JOIN 
    dbo.InventTableModule ITM 
    ON IT.ItemId = ITM.ItemId 
    AND IT.DataAreaId = ITM.DataAreaId
WHERE 
    ITM.ModuleType = 0 -- Assuming 0 represents the sales module, adjust as necessary

A first version of the query didn't include the join on DataAreaId. Another problem is that, compared with the previous versions, in D365 the ItemName isn't available in the InventTable anymore, while the ItemName was incorrectly used as the Description. The differences between the various versions of D365 are minimal, though the differences between D365 and AX 2009 are considerable, with important changes due to the redesign.

Prompt:
Rewrite the query to provide the other existing prices via a grouping.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , STRING_AGG(CAST(ITM.Price AS NVARCHAR), ', ') AS [Item Prices]
FROM 
    dbo.InventTable IT
JOIN 
    dbo.InventTableModule ITM 
    ON IT.ItemId = ITM.ItemId 
    AND IT.DataAreaId = ITM.DataAreaId
WHERE 
    ITM.ModuleType = 0 -- Assuming 0 represents the sales module, adjust as necessary
GROUP BY 
    IT.ItemId, IT.ItemName

The generated query shows all the prices as a string, a solution that might work if we change the delimiter, though it's not an optimal solution at all (see the sketch below).
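
As a side note, here's a minimal sketch of such a variant (same assumed tables; ItemName is left out because it's no longer available in InventTable, and the delimiter and ModuleType prefix are arbitrary choices):

-- all prices as delimited text, prefixed by the module type (sketch)
SELECT IT.ItemId AS [Item Number]
, STRING_AGG(CONCAT(ITM.ModuleType, ': ', CAST(ITM.Price AS NVARCHAR(20))), '; ')
  WITHIN GROUP (ORDER BY ITM.ModuleType) AS [Item Prices]
FROM dbo.InventTable IT
     JOIN dbo.InventTableModule ITM
       ON IT.ItemId = ITM.ItemId
      AND IT.DataAreaId = ITM.DataAreaId
GROUP BY IT.ItemId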

Prompt:
Rewrite the query to provide the other existing individual prices via a grouping in separate fields.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , MAX(CASE WHEN ITM.ModuleType = 0 THEN ITM.Price END) AS [Sales Price]
    , MAX(CASE WHEN ITM.ModuleType = 1 THEN ITM.Price END) AS [Purchase Price]
    , MAX(CASE WHEN ITM.ModuleType = 2 THEN ITM.Price END) AS [Cost Price]
FROM 
    dbo.InventTable IT
JOIN 
    dbo.InventTableModule ITM 
    ON IT.ItemId = ITM.ItemId 
    AND IT.DataAreaId = ITM.DataAreaId
GROUP BY 
    IT.ItemId, IT.ItemName

This version is closer to the original query I wrote some time ago:

-- Item prices via group by
SELECT ITD.DataAreaId 
, ITD.ItemId  
, ITD.InventPrice
, ITD.InventUnitId
, ITD.PurchPrice
, ITD.PurchUnitId
, ITD.SalesPrice
, ITD.SalesUnitId
FROM dbo.InventTable ITM
     LEFT JOIN (-- price details
     SELECT ITD.ItemId
     , ITD.DataAreaId 
     , Max(CASE ITD.ModuleType WHEN 0 THEN ITD.Price END) InventPrice
     , Max(CASE ITD.ModuleType WHEN 0 THEN ITD.UnitId END) InventUnitId
     , Max(CASE ITD.ModuleType WHEN 1 THEN ITD.Price END) PurchPrice
     , Max(CASE ITD.ModuleType WHEN 1 THEN ITD.UnitId END) PurchUnitId
     , Max(CASE ITD.ModuleType WHEN 2 THEN ITD.Price END) SalesPrice
     , Max(CASE ITD.ModuleType WHEN 2 THEN ITD.UnitId END) SalesUnitId
     FROM dbo.InventTableModule ITD
     GROUP BY ITD.ItemId
     , ITD.DataAreaId 
    ) ITD
       ON ITD.ItemId = ITM.ItemId
      AND ITD.DataAreaId = ITM.DataAreaId
WHERE ITD.DataAreaID = 'XYZ'
ORDER BY ITD.ItemId

Prompt:
Rewrite the query to provide the other existing individual prices via a grouping in separate fields.

Output:

SELECT 
    , IT.ItemId AS [Item Number]
    , IT.ItemName AS [Description]
    , Sales.Price AS [Sales Price]
    , Purchase.Price AS [Purchase Price]
    , Cost.Price AS [Cost Price]
FROM 
    dbo.InventTable IT
LEFT JOIN 
    dbo.InventTableModule Sales 
    ON IT.ItemId = Sales.ItemId 
    AND IT.DataAreaId = Sales.DataAreaId 
    AND Sales.ModuleType = 0 -- Assuming 0 represents the sales module
LEFT JOIN 
    dbo.InventTableModule Purchase 
    ON IT.ItemId = Purchase.ItemId 
    AND IT.DataAreaId = Purchase.DataAreaId 
    AND Purchase.ModuleType = 1 -- Assuming 1 represents the purchase module
LEFT JOIN 
    dbo.InventTableModule Cost 
    ON IT.ItemId = Cost.ItemId 
    AND IT.DataAreaId = Cost.DataAreaId 
    AND Cost.ModuleType = 2 -- Assuming 2 represents the cost module

The structure of the query looks promising, though, as in the previous query, Copilot got the uses for ModuleType wrong (see below)! And here's the query I wrote previously:

-- Item Prices without grouping
SELECT ITM.DataAreaId 
, ITM.ItemId 
, ILP.UnitId InventUnitId 
, ILP.Price InventPrice 
, IPP.UnitId PurchaseUnitId
, IPP.Price PurchasePrice
, ISP.UnitId SalesUnitId
, ISP.Price SalesPrice
FROM dbo.InventTable ITM
      LEFT JOIN dbo.InventTableModule ILP
        ON ITM.ItemId = ILP.ItemId
       AND ITM.DataAreaId = ILP.DataAreaId
       AND ILP.ModuleType = 0 -- Warehouse
      LEFT JOIN dbo.InventTableModule IPP
        ON ITM.ItemId = IPP.ItemId
       AND ITM.DataAreaId = IPP.DataAreaId 
       AND IPP.ModuleType = 1 -- Purchases
      LEFT JOIN dbo.InventTableModule ISP
        ON ITM.ItemId = ISP.ItemId
       AND ITM.DataAreaId = ISP.DataAreaId 
       AND ISP.ModuleType = 2 -- Sales	
WHERE ITM.DataAreaId = 'XYZ'

Probably the Copilot for Dynamics 365 Finance and Operations [1] works much better than the one from M365. Unfortunately, I don't have access to it yet! Also, if I invested more time in the prompts, the results would be closer to the queries I wrote. It depends also on the goal(s) considered - build a skeleton on which to build the logic, respectively generate the final query via the prompts. Probably the 80-20 rule applies here as well.

Frankly, for a person not knowing the D365 data model, the final queries generated are a good starting point (at least for searching for more information on the web), as long as Copilot gets the prompts right. It will also be interesting to see how business rules related to specific business processes (including customizations) will be handled. The future looks bright (at least for the ones still having a job in the field)!

Happy coding!

Previous Post <<||>> Next Post

References:
[1] Microsoft Learn (2024) Microsoft Learn: Overview of Copilot capabilities in finance and operations apps [link]

25 December 2024

💎🏭SQL Reloaded: Number of Records IV (via sys.partitions DMV)

To get the exact number of records in a table one can use the COUNT (see post) or the more recent COUNT_BIG function (see the sketch after the query below), though for big tables this can be an inefficient operation for the database engine:

-- number of records via COUNT
SELECT count(*) row_count
FROM SalesLT.Product
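
The same count via COUNT_BIG [2], which returns a bigint and therefore doesn't overflow for very large tables:

-- number of records via COUNT_BIG
SELECT count_big(*) row_count
FROM SalesLT.Product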

Moreover, sometimes the operation needs to be repeated for a number of tables, e.g. dropdown tables in Dynamics 365 for Finance and Operations (D365 F&O). Writing the query as a UNION allows exporting the data as a single table and doing comparisons (e.g. in Excel). The same approach can also be used when multiple columns are used for grouping, though one must account for the additional columns in the other subqueries (see the sketch after the next query). However, the more tables are involved, the more difficult it becomes to maintain the query over time. 
 
-- number of records via COUNT for multiple tables
SELECT 'SalesLT.Product' table_name
, count(*) row_count
FROM SalesLT.Product
UNION ALL
SELECT 'SalesLT.ProductDescription' table_name
, count(*) row_count
FROM SalesLT.ProductDescription
UNION ALL
SELECT 'SalesLT.ProductModel' table_name
, count(*) row_count
FROM SalesLT.ProductModel
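
A minimal sketch of the grouping case, using the Color column from SalesLT.Product as an additional grouping attribute and a NULL placeholder in the subqueries where the column isn't available:

-- number of records via COUNT with an additional grouping column
SELECT 'SalesLT.Product' table_name
, Color
, count(*) row_count
FROM SalesLT.Product
GROUP BY Color
UNION ALL
SELECT 'SalesLT.ProductModel' table_name
, NULL Color -- placeholder for the missing grouping column
, count(*) row_count
FROM SalesLT.ProductModel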

There are many scenarios in which an approximate number of records in a table is enough, and doing an exact record count might prove to be too expensive. For a quick and dirty solution one can use the sys.partitions DMV instead:

-- number of records via DMV for single object
SELECT object_id
, OBJECT_NAME(object_id) object_name
, OBJECT_SCHEMA_NAME(object_id) schema_name
, SUM(Rows) AS row_count
, data_compression_desc AS compression_type
, COUNT(*) partitions_count
FROM sys.partitions 
WHERE index_id < 2 --ignore the partitions from the non-clustered index if any
  AND OBJECT_ID('SalesLT.Product') = object_id
GROUP BY object_id
, data_compression_desc
ORDER BY row_count DESC;

The query is based on the sys.partitions table [1], which contains a row for each partition of all the tables and most types of indexes in the database. The documentation mentions that "rows" indicates the approximate number of rows in the considered partition.

Alternatively, one can bring more tables into the query to extend its range of applicability. 

-- number of records via DMVs
SELECT S.name + '.' + T.name SearchName
, S.Name SchemaName
, T.name TableName
, P.row_count
, P.compression_type
, P.partitions_count
FROM sys.tables T
     LEFT JOIN (
        SELECT object_id
        , SUM(Rows) AS row_count
        , data_compression_desc AS compression_type
        , COUNT(*) partitions_count
        FROM sys.partitions 
        WHERE index_id < 2 --ignore the partitions from the non-clustered index if any
        --AND OBJECT_ID('SalesLT.Product') = object_id
        GROUP BY object_id
        , data_compression_desc
     ) P
    ON T.object_id = P.object_id
     JOIN sys.schemas as S
	   on S.schema_id = T.schema_id
WHERE S.Name = 'SalesLT'
  AND T.Name LIKE 'Product%'
ORDER BY row_count DESC;

The data can be exported regularly to give an idea of how tables' cardinality changes over time. One can find this useful as part of the loading process in data warehouses or other solutions (e.g. data migrations). 

To retrieve only the tables that actually have records, one can filter on the row count (e.g. row_count > 0) or use an inner JOIN instead of the LEFT JOIN. 

One should consider only the tables in scope and, if needed, remove the records associated with the system objects (e.g. sys or information_schema, as the case may be).

 -- constraints to be added in the WHERE clause to remove the records related to system objects
 AND OBJECT_NAME(object_id) NOT LIKE 'sys%'
 AND OBJECT_NAME(object_id) NOT LIKE 'queue_%' 
 AND OBJECT_NAME(object_id) NOT LIKE 'filestream_tombstone%' 

There are also scenarios in which the count is needed only for a subset of the data. It's the case of D365 F&O, in which the number of records is needed by DataAreaId (aka company) or another field. A solution can be built using the sp_MSForEachTable stored procedure (see the last query from this post) and a cursor, as sketched below.
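
A minimal sketch based on sp_MSForEachTable; the table filter is illustrative, and it assumes that the tables in scope expose a DataAreaId column (as the D365 F&O tables do):

-- number of records per company (DataAreaId) via sp_MSForEachTable (sketch)
CREATE TABLE #RowCounts (
  TableName sysname
, DataAreaId nvarchar(4) -- DataAreaId is 4 characters in D365 F&O
, RowCnt bigint
)

EXEC sp_MSForEachTable @command1 = N'INSERT INTO #RowCounts (TableName, DataAreaId, RowCnt)
    SELECT ''?'', DataAreaId, count_big(*)
    FROM ?
    GROUP BY DataAreaId'
, @whereand = N' AND o.name LIKE ''Invent%''' -- restrict the scope to tables having a DataAreaId column

SELECT TableName
, DataAreaId
, RowCnt
FROM #RowCounts
ORDER BY TableName
, DataAreaId

DROP TABLE #RowCounts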

Notes:
1) Unfortunately, in Microsoft Fabric warehouses the sys.partitions.rows is 0 for all user tables, and currently the sp_MSForEachTable stored procedure can't be used either to retrieve the number of records for all tables. However, one can create an old-fashioned cursor for iterating through the collection of tables in scope.
2) The code used in this post is available also in the GitHub repository.

Happy coding and Merry Christmas!

Previous Post <<||>> Next Post

Resources:
[1] Microsoft Learn (2024) sys.partitions (Transact-SQL) [link]
[2] Microsoft Learn (2024) COUNT_BIG (Transact-SQL) [link]

04 March 2024

🧭🏭Business Intelligence: Microsoft Fabric (Part II: Domains and the Data Mesh I -The Challenge of Structure Matching)

Business Intelligence Series

The holy grail of building a Data Analytics infrastructure seems to be nowadays the creation of a data mesh, a decentralized data architecture that organizes data by specific business domains. This endeavor proves to be difficult to achieve given the various challenges faced  – data integration, data ownership, data product creation and ownership, enablement of data citizens, respectively enforcing security and governance in a federated manner. 

Microsoft Fabric promises to facilitate the creation of data meshes with the help of domains and subdomains by providing built-in security, administration, and governance features associated with them. A domain is a way of logically grouping together all the data in an organization that is relevant to a particular area or field. A subdomain is a way for fine tuning the logical grouping of the data.

Business domains & their entities

At a high level the challenge of building a data mesh is how to match or aggregate structures. On one side is the high-level structure of the data mesh, while on the other side is the structure of the business data entities. The data entities can be grouped within a taxonomy with multiple levels that expands to the departments. That's why it seems somewhat natural to consider the departments as the top-most domains of the data mesh. The issue is that if the segmentation starts from a high level, it becomes inflexible in modeling. Moreover, one has only domains and subdomains, and thus a 2-level structure to model the main aspects of the data mesh.

Some organizations allow unrestricted access to the data belonging to a given department, while others break down the access to a more granular level. There are also organizations that don't restrict the access at all, though this may change later. Besides permissions and a way of grouping the entities together, what value does setting the domains as departments bring? 

Therefore, I'm not convinced about using an organization's departmental structure as domains, especially when such a structure may change, and this would imply a full range of further changes. Moreover, such a structure doesn't reflect the span of processes or how permissions are assigned for the various roles, which are better reflected in how information systems are structured. Most probably the solution needs to accommodate both perspectives and be somewhere in the middle. 

Take for example the internal structure of the modules from Dynamics 365 (D365). The Finance area is broken down into Accounts Payable, Accounts Receivable, Fixed Assets, General Ledger, etc. In some organizations the departments reflect this delimitation to some degree, while in others they are just associated with finance-related roles. Moreover, the permissions are more granular, reflecting the data entities the users work with. 

Conversely, SCM extends into Finance, as Purchase orders, Sales orders and other business documents are the starting or intermediary points of processes that span modules. Similarly, there are processes that start in CRM or other systems. The span of processes seems more appropriate for structuring the data mesh, though the overlap of systems with the roles involved in the processes and the free definition of process boundaries can overcomplicate the whole design.

It makes sense to define the domains at a level that resembles the structure of the modules available in D365, while the macro data-entities represent the subdomains. The subdomains would then represent master as well as transactional data entities from the perspective of the domains, though there will be entities that need to be shared between multiple domains. Such a structure has fewer chances to change over time, allowing more flexibility and smaller areas of focus, and thus is easier to design, develop, test, deploy and maintain.

Previous Post <<||>> Next Post

10 January 2024

📦Data Migrations (DM): Conceptualization (Part VIII: Create a Data Migration Strategy for Dynamics 365)

Data Migration
Data Migration Series

Microsoft made available a module on creating a Data Migration strategy (DMS) for Dynamics 365 solutions [1] and provides a template for the DMS workshop on GitHub. Despite the all-encompassing title, the module focuses mainly on the latter. Here are several points to consider in addition.

The workshop allows gathering the most important aspects related to the DM; however, it's useful to make the strategy explicit as part of a document and share it with the team. The DMS document should follow the general elaboration of a strategy by providing information about the current and targeted state, goals and objectives, assumptions, dependencies, risks and opportunities, the architecture (including alternatives that need to be explored), direct and interfacing processes, roles involved and their responsibilities, communication paths (including escalation), etc. 

Even if the template discusses general aspects, the treatment of best practices is minimal (see some technical and architectural aspects).

The DMS should describe when possible the high-level architecture planned for DM. The architecture section can be updated as the project progresses. 

One can plan for 3-4 dry-runs plus the UAT, though further iterations might be needed depending on the data quality, respectively the complexity of the logic needed. Each dry-run will have its own scope (in terms of entities and number of records) and objectives (in terms of passed tests), while the UAT attempts to mimic the same conditions as the DM for Go-Live.

Several iterations are usually needed to identify the entities in scope. Moreover, the scope will suffer several changes as new entities and migration areas are identified. There are high chances that some data entities will be forgotten or postponed until after Go-Live. Upon case, the business might approach a phased implementation, each phase with its own DM.

It's useful to have a list of the entities available in the organization and map them to the entities available in the target system(s). This allows identifying the gaps on the DM side. There will be entities for which data need to be provided, entities for which there's no counterpart in the target system, respectively entities with different levels of detail.

An important step of the process is defining the business rules that apply - what subset of data is migrated (e.g. active referenced master data, open transactions, etc.). The definitions will suffer further changes as business perception about what's needed changes.

As they'll go through multiple changes, it's useful to have such lists in Excel or similar repositories and reference them in the strategy and further documents. 

The strategy can also target the post-Go-Live migration activities, though the process and architecture might be slightly different, even if many of the requirements still apply. One can devise the further aspects as part of the DMS or consider them as part of the Data Management strategy and operations.

Despite the considerable overlapping of the DM with Data Management, at least until now the DM is not considered as part of the latter. Big organizations with periodic DMs might need to consider a broader strategy, and then address punctually the deviations of each project. 

The DM is usually considered as a subproject of the implementation and should be managed as such. Depending on its scope and characteristics (principles, acceptance criteria) the DM might use a different methodology and timeline than the main project. Defining project's goals/objectives, outcomes, stakeholders, timeline, phases, and milestones, respectively the high-level assumptions, dependencies, risks, opportunities and constraints is Project Management 101.

The DM is critical for an implementation's success. It would be useful to describe the architectural alternatives, as well to provide a series of architectural recommendations.

Previous Post <<||>> Next Post

Resources:
[1] Microsoft learn (2023) Create a data migration strategy for Dynamics 365 solutions (link)
