Showing posts sorted by date for query: Data management

17 June 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 24: The Next Steps in the Development of AI)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on the next steps in the development of AI"

Introduction

Artificial Intelligence (AI) is evolving at an unprecedented pace, shaping industries, economies, and daily life. As AI continues to advance, researchers, businesses, and governments are focusing on the next steps in its development - from improving efficiency to addressing ethical concerns. This blog post explores the key areas driving AI’s future growth and the innovations that will define the next phase of AI evolution.

1. Advancing AI Infrastructure and Computing Power

AI’s capabilities depend on high-performance computing, and the next phase of development will focus on enhancing AI infrastructure. Companies and governments are investing in AI supercomputers to support large-scale AI applications.

For example, Ericsson, AstraZeneca, Saab, SEB, and Wallenberg Investments have formed a consortium to build Sweden’s largest AI supercomputer, powered by Nvidia’s DGX SuperPODs. This infrastructure will enable AI-driven advancements in network intelligence, drug discovery, defense systems, and financial services.

2. AI for Secure and Ethical Software Development

As AI becomes more integrated into software development, security concerns are growing. Companies like Snyk are launching AI-native platforms to govern and secure AI-driven software. These platforms aim to:

  • Reduce AI-generated security vulnerabilities in software development.
  • Enhance governance and policy enforcement for AI applications.
  • Improve AI security posture management to prevent cyber threats.

Ensuring AI is secure and ethically developed will be a major focus in the coming years.

3. AI in Data Analytics and Business Intelligence

AI is revolutionizing data analytics, enabling businesses to process information faster and more efficiently. dbt Labs has introduced the dbt Fusion engine, designed to optimize AI-driven analytics and reduce cloud costs. This innovation will:

  • Accelerate AI-powered data processing for enterprises.
  • Improve AI-driven decision-making in business intelligence.
  • Enhance AI’s ability to manage large-scale data pipelines.

AI’s role in business intelligence and analytics will continue to expand, making data-driven strategies more effective.

4. AI for National and Regional Development

Governments and corporations are investing in AI to drive national and regional innovation. Safaricom, for example, has pledged $500 million to develop AI infrastructure in East Africa. This investment will support:

  • AI-powered data centers for regional businesses.
  • Edge computing capabilities to improve AI efficiency.
  • Digital skills programs to train local developers in AI technologies.

AI’s development is becoming a global priority, with nations investing in AI to boost economic growth and technological leadership.

5. AI’s Role in Scientific Discovery and Automation

AI is transforming scientific research by accelerating discoveries in medicine, engineering, and environmental science. Future AI advancements will focus on:

  • AI-driven drug discovery to develop new treatments faster.
  • AI-powered automation in manufacturing to improve efficiency.
  • AI-enhanced climate modeling to predict environmental changes.

AI’s ability to process vast amounts of data will continue to drive breakthroughs in multiple fields.

Conclusion: AI’s Future Is Shaped by Innovation and Responsibility

The next steps in AI development will focus on enhancing infrastructure, securing AI-driven software, optimizing data analytics, expanding AI’s role in national development, and advancing scientific discovery. As AI continues to evolve, responsible innovation and ethical governance will be crucial to ensuring AI benefits society.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.


09 June 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 16: How AI Can Use Its Intelligence to Help Society)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on how AI can use its intelligence to help society"

Introduction

Artificial Intelligence (AI) is reshaping industries, solving complex problems, and unlocking new possibilities in science, business, and daily life. While AI comes with challenges, its potential to positively impact society is undeniable. From healthcare advancements to environmental sustainability, AI can use its intelligence to empower individuals, strengthen communities, and create a better future.

1. AI Revolutionizing Healthcare and Medicine

AI-driven tools are transforming healthcare by improving diagnostics, enhancing treatment plans, and enabling personalized medicine. Some of the ways AI benefits healthcare include:

  • Early disease detection: AI-powered medical imaging can identify patterns in X-rays, MRIs, and CT scans that humans might miss.
  • Predictive analytics: AI helps anticipate disease outbreaks by analyzing health trends and patient data.
  • Personalized treatment plans: AI tailors medications and therapies based on individual genetic profiles for better outcomes.

AI’s ability to process vast amounts of medical data makes healthcare more efficient, accessible, and accurate, ultimately saving lives.

2. AI Enhancing Education and Learning

AI is revolutionizing education by personalizing learning experiences and supporting educators. Key benefits include:

  • Smart tutoring systems: AI adapts lessons based on a student’s strengths and weaknesses.
  • Automated grading: AI streamlines assessments, giving teachers more time for one-on-one interaction.
  • Language translation: AI breaks barriers, making education globally accessible.

AI ensures education is more inclusive, adaptive, and engaging, helping learners reach their full potential.

3. AI Tackling Climate Change and Environmental Issues

AI plays a crucial role in environmental sustainability by providing data-driven insights into climate change, conservation efforts, and resource management. AI-driven solutions include:

  • Optimizing energy consumption: AI manages electricity grids for efficiency, reducing waste.
  • Wildlife protection: AI analyzes satellite images to track endangered species and prevent illegal deforestation.
  • Climate modeling: AI predicts environmental changes, helping scientists develop proactive solutions.

AI can monitor, predict, and mitigate environmental challenges, contributing to a healthier planet.

4. AI Advancing Social Good and Humanitarian Efforts

AI supports humanitarian efforts by improving disaster response, combating misinformation, and fostering inclusion. Notable AI contributions include:

  • Disaster relief optimization: AI predicts hurricanes, earthquakes, and floods, enabling faster emergency response.
  • Combating misinformation: AI detects fake news and disinformation, protecting public discourse.
  • Enhancing accessibility: AI-powered speech recognition assists people with disabilities, enabling greater participation.

AI strengthens societies by improving communication, accessibility, and crisis management.

5. AI Driving Economic Growth and Innovation

Businesses and industries thrive on AI’s ability to optimize workflows, reduce inefficiencies, and accelerate innovation. AI promotes economic benefits such as:

  • Smarter supply chain management: AI predicts demand fluctuations, improving production efficiency.
  • AI-driven research: AI accelerates scientific discoveries in medicine, engineering, and material sciences.
  • Job creation in AI fields: AI generates careers in data science, software engineering, and AI ethics.

AI boosts productivity across industries, ensuring economic resilience and technological advancement.

Conclusion: AI’s Role in Shaping a Better Future

AI is more than just technology - it’s a tool for empowerment, innovation, and sustainability. By harnessing AI’s intelligence responsibly, society can improve healthcare, education, environmental protection, humanitarian efforts, and economic growth.

The key to maximizing AI’s benefits lies in ethical development, transparency, and collaboration, ensuring AI serves humanity positively and equitably.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.


29 May 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 5: Why There Are Limits to AI Growth)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on why there are limits to growth for AI"

Introduction

Artificial Intelligence (AI) is evolving rapidly, transforming industries and redefining how humans interact with technology. However, despite its advancements, AI is not without limitations. While many claim that AI’s potential is limitless, several fundamental constraints affect its development, usability, and ethical application. In this blog post, we explore why AI growth is not boundless and the factors that restrict its expansion.

Computational and Hardware Constraints

AI depends heavily on computational power to process large amounts of data efficiently. While improvements in processing speeds, cloud computing, and quantum computing are promising, hardware limitations remain a bottleneck.

  • Energy consumption: Training sophisticated AI models requires immense energy, leading to high costs and environmental concerns.
  • Hardware limitations: Even cutting-edge processors (GPUs, TPUs) struggle with increasingly complex AI models, limiting performance scalability.
  • Data storage needs: AI requires vast amounts of storage and memory, which can be difficult to scale beyond current technologies.

Despite ongoing advancements, these factors impose real-world limitations on AI expansion, preventing it from growing indefinitely.

Dependence on Data and Human Oversight

AI is fundamentally data-driven, meaning its accuracy and capabilities are only as good as the data it receives. However, data availability and quality present major challenges:

  • Data scarcity: Not all industries generate sufficient high-quality datasets, limiting AI effectiveness.
  • Bias in data: AI models can inherit human biases, leading to ethical concerns and flawed decision-making.
  • Human intervention: AI requires constant supervision for algorithm adjustments, data corrections, and error management.

Since AI models learn from existing human-created data, they are restricted by data inconsistencies, biases, and inaccuracies - creating natural growth barriers.

Ethical and Legal Restrictions

AI operates within a world governed by laws, regulations, and ethical constraints, which significantly limit its unrestricted development:

  • Privacy concerns: AI depends on personal data for optimization, but privacy regulations (GDPR, CCPA) restrict unchecked data usage.
  • Security risks: AI-driven automation increases the risk of hacking, misinformation, and deepfake fraud, requiring strict oversight.
  • Job displacement worries: Governments and institutions limit AI adoption to protect workers and avoid societal disruption.

As AI continues to advance, legal frameworks must adapt - but regulatory boundaries will always exist to prevent ethical breaches.

AI’s Lack of True Consciousness

Unlike humans, AI does not possess consciousness, intuition, or creativity in the way people do. While AI mimics intelligence, it lacks genuine understanding:

  • No self-awareness: AI does not truly “think”; it only processes patterns based on data inputs.
  • Limited creativity: AI can generate art and music, but it does not comprehend emotion, cultural context, or abstract thought.
  • Inability to make ethical judgments: AI lacks moral reasoning and cannot replace human values or ethical decision-making.

This inherent lack of cognitive ability prevents AI from achieving true autonomy, reinforcing its growth limitations.

Final Thoughts: AI’s Evolution Has Limits

While AI continues to expand, it is not limitless due to constraints in hardware, data, ethics, and intelligence itself.

AI can optimize processes, enhance decision-making, and revolutionize industries, but it will always require human oversight and face technical and ethical barriers.

Rather than seeing AI as boundless, we should focus on responsible development, ensuring it remains a powerful tool rather than an unchecked force.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.


23 May 2025

🏭🗒️Microsoft Fabric: Warehouse Snapshots [Notes] 🆕

Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

Last updated: 23-May-2025

[Microsoft Fabric] Warehouse Snapshots

  • {def} read-only representation of a warehouse at a specific point in time [1]
  • allows support for analytics, reporting, and historical analysis scenarios without worrying about the volatility of live data updates [1]
    • provide a consistent and stable view of data [1]
    • ensuring that analytical workloads remain unaffected by ongoing changes or ETL operations [1]
  • {benefit} guarantees data consistency
    • the dataset remains unaffected by ongoing ETL processes [1]
  • {benefit} immediate roll-forward updates
    • can be seamlessly rolled forward on demand to reflect the latest state of the warehouse
      • ⇒ {benefit} consumers access the same snapshot using a consistent connection string, even from third-party tools [1]
      • ⇐ updates are applied immediately, as if in a single, atomic transaction [1]
  • {benefit} facilitates historical analysis
    • snapshots can be created on an hourly, daily, or weekly basis to suit business requirements [1]
  • {benefit} enhanced reporting
    • provides a point-in-time reliable dataset for precise reporting [1]
      • ⇐ free from disruptions caused by data modifications [1]
  • {benefit} doesn't require separate storage [1]
    • relies on source Warehouse [1]
  • {limitation} doesn't support database objects 
  • {limitation} can only capture a state from within the last 30 days
  • {operation} create snapshot
    • via New warehouse snapshot
    • multiple snapshots can be created for the same parent warehouse [1]
      • appear as child items of the parent warehouse in the workspace view [1]
      • queries run against a snapshot return the data as it existed at the snapshot's point in time [1]
  • {operation} read properties 
    • via the REST API:
      GET https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{warehousesnapshotId}
      Authorization: Bearer <bearer token>
  • {operation} update snapshot timestamp
    • allows users to roll forward data instantly, ensuring consistency [1]
      • use current state
        • via ALTER DATABASE [<snapshot name>] SET TIMESTAMP = CURRENT_TIMESTAMP; 
      • use point in time
        • via ALTER DATABASE [<snapshot name>] SET TIMESTAMP = 'YYYY-MM-DDTHH:MM:SS.SS'; -- UTC time
    • queries that are in progress during point in time update will complete against the version of data they were started against [1]
  • {operation} rename snapshot
  • {operation} delete snapshot
    • via DELETE
    • when the parent warehouse gets deleted, the snapshot is also deleted [1]
  • {operation} modify source table
    • DDL changes to source will only impact queries in the snapshot against tables affected [1]
  • {operation} join multiple snapshots
    • the resulting snapshot date will be applied to each warehouse connection [1]
  • {operation} retrieve metadata
    • via sys.databases [1]
  • [permissions] inherited from the source warehouse [1]
    • ⇐ any permission changes in the source warehouse applies instantly to the snapshot [1]
    • security updates on the source database are reflected immediately in the snapshot databases [1]
  • {limitation} can only be created against new warehouses [1]
    • created after Mar-2025
  • {limitation} do not appear in SSMS Object Explorer but will show up in the database selection dropdown [1]
  • {limitation} datetime can be set to any date in the past up to 30 days or database creation time (whichever is later)  [1]
  • {limitation} modified objects after the snapshot timestamp become invalid in the snapshot [1]
    • applies to tables, views, and stored procedures [1]
  • {limitation} must be recreated if the data warehouse is restored [1]
  • {limitation} aren’t supported on the SQL analytics endpoint of the Lakehouse [1]
  • {limitation} aren’t supported as a source for OneLake shortcuts [1]
  • [Power BI] {limitation} requires DirectQuery or Import mode [1]
    • doesn’t support Direct Lake
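As a quick illustration of two of the points above - the REST call for reading a snapshot's properties and the rule that the snapshot timestamp can go back at most 30 days or to database creation time, whichever is later - here is a minimal Python sketch. The helper names are my own (not part of any SDK), and the IDs and token are placeholders:

```python
import urllib.request
from datetime import datetime, timedelta, timezone

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_snapshot_request(workspace_id: str, snapshot_id: str,
                           token: str) -> urllib.request.Request:
    """Build the GET request for a warehouse snapshot's properties."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{snapshot_id}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def earliest_snapshot_timestamp(now: datetime, db_created: datetime) -> datetime:
    """Earliest point in time a snapshot may target: at most 30 days back,
    or the database creation time, whichever is later."""
    return max(now - timedelta(days=30), db_created)

# Usage (placeholder IDs/token; the request is built but not sent here):
req = build_snapshot_request("<workspaceId>", "<warehousesnapshotId>", "<bearer token>")
print(req.full_url)

now = datetime(2025, 5, 23, tzinfo=timezone.utc)
print(earliest_snapshot_timestamp(now, datetime(2025, 3, 1, tzinfo=timezone.utc)))  # 30-day floor applies
print(earliest_snapshot_timestamp(now, datetime(2025, 5, 1, tzinfo=timezone.utc)))  # creation time is later
```

Sending the request (e.g. via urllib.request.urlopen) would require a valid Microsoft Entra bearer token.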

    References:
    [1] Microsoft Learn (2025) Fabric: Warehouse Snapshots in Microsoft Fabric (Preview) [link]
    [2] Microsoft Learn (2025) Warehouse snapshots (preview) [link]
    [3] Microsoft Learn (2025) Create and manage a warehouse snapshot (preview) [link]

    Acronyms:
    DDL - Data Definition Language
    ETL - Extract, Transform, Load
    MF - Microsoft Fabric
    SSMS - SQL Server Management Studio

    20 May 2025

    🤖〽️Prompt Engineering: Copilot Unabridged (Part 2: The Importance of AI in Society - A Transformational Force II)

    Prompt Engineering Series

    Prompt: "write a blogpost of 600 words on the importance of AI in society"

    Introduction

    Artificial Intelligence (AI) has rapidly evolved from a futuristic dream to an integral part of everyday life. Its presence is felt in industries ranging from healthcare to finance, transportation to entertainment. While some view AI as a tool of convenience, others recognize it as a force capable of reshaping the very fabric of society. As we continue integrating AI into our world, its impact - both promising and challenging - deserves careful consideration.

    Revolutionizing Industries

    AI’s ability to process vast amounts of data and uncover patterns has revolutionized multiple industries. In healthcare, AI-driven diagnostics assist doctors in identifying diseases like cancer with unparalleled precision. Medical imaging powered by AI can detect abnormalities that might otherwise be overlooked. Moreover, AI applications in drug discovery accelerate research, leading to faster development of treatments.

    In finance, AI-powered algorithms analyze market trends, optimize investments, and even detect fraudulent transactions. Companies utilize AI to enhance customer service through chatbots that provide instant responses and personalized recommendations. Meanwhile, AI-driven automation boosts efficiency in manufacturing by streamlining processes and reducing human error.

    Transportation is another domain benefiting from AI. Autonomous vehicles, once thought to be purely speculative, are now in active development, promising a future with safer roads and reduced congestion. AI-driven traffic management systems enhance urban mobility, making cities more efficient and eco-friendly.

    Enhancing Accessibility and Communication

    One of AI’s most profound societal contributions is its role in accessibility. AI-powered speech recognition and text-to-speech tools empower individuals with disabilities, allowing seamless interaction with technology. AI-driven language translation facilitates cross-cultural communication, bridging gaps between people and businesses worldwide.

    In education, AI-powered tutoring systems adapt to students’ learning styles, providing personalized guidance and support. AI can analyze student performance, identify areas for improvement, and recommend tailored resources. This transformative approach makes education more inclusive and effective, breaking down traditional barriers.

    Addressing Ethical and Societal Concerns

    Despite AI’s remarkable benefits, its advancement raises ethical concerns. Bias in AI algorithms is a pressing issue, as data-driven models can inadvertently reflect and perpetuate existing societal inequalities. The responsibility falls on developers to ensure fairness and transparency in AI systems, fostering an equitable digital landscape.

    Another concern is the impact of automation on employment. While AI creates new job opportunities in tech-driven industries, it also replaces traditional roles, sparking debates about workforce adaptation. The key lies in preparing society for AI’s integration by promoting education in AI-related fields and encouraging continuous skill development.

    Privacy and security also come into focus as AI algorithms collect and analyze vast amounts of personal data. Safeguarding user privacy while leveraging AI’s capabilities requires stringent regulations and ethical AI development.

    Embracing a Future Powered by AI

    AI’s influence on society is undeniable - it has transformed industries, empowered individuals, and presented new opportunities. However, navigating AI’s complexities requires thoughtful implementation and ethical considerations. As we move forward, collaboration between policymakers, researchers, and the public will be crucial in ensuring AI benefits humanity responsibly and inclusively.

    The future of AI is not just about innovation; it is about shaping a world where technology uplifts rather than divides. By fostering awareness, ethical development, and responsible use, we can harness AI as a tool for progress, paving the way for a society that embraces both human intelligence and artificial intelligence in harmony.

    Disclaimer: The whole text was generated by Copilot (under Windows 10) at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.


    🤖〽️Prompt Engineering: Copilot Unabridged (Part 1: The Importance of AI in Society - A Transformational Force)

    Prompt Engineering Series

    Prompt: "write a blogpost of 600 words on the importance of AI in society"

    Introduction

    Artificial Intelligence (AI) has evolved from a futuristic concept to a daily reality that shapes industries, economies, and even personal lives. Once confined to science fiction, AI now drives technological progress and redefines the way we interact with the world. In this blog post, we will explore the significance of AI, how it influences various sectors, and the ethical considerations accompanying its rapid adoption.

    AI: The Backbone of Modern Innovation

    At its core, AI enables machines to perform tasks that typically require human intelligence - such as learning, reasoning, problem-solving, and decision-making. From automating repetitive tasks to augmenting human capabilities, AI has proven indispensable across multiple domains.

    Take healthcare, for example. AI-powered algorithms analyze complex medical data to assist doctors in diagnosing diseases, predicting patient outcomes, and even recommending personalized treatment plans. Innovations like AI-driven robotic surgery and early cancer detection systems save lives and enhance the precision of medical procedures.

    In finance, AI plays a crucial role in fraud detection, risk assessment, and automated trading. Financial institutions leverage machine learning models to analyze spending patterns, prevent cyberattacks, and provide tailored investment advice. AI’s ability to process massive amounts of data ensures accuracy and efficiency in financial operations.

    Retail and e-commerce have also been transformed. AI-driven recommendation engines predict customer preferences based on purchasing behavior, allowing businesses to offer personalized shopping experiences. Chatbots streamline customer service, while automated inventory management optimizes supply chain efficiency. AI ensures that companies stay competitive in an increasingly digital marketplace.

    AI and the Workplace: Automation vs. Augmentation

    The integration of AI into the workplace sparks ongoing debates about its impact on jobs. While automation replaces certain manual tasks, it also opens doors for new career opportunities. AI-driven tools enable professionals to focus on creative and strategic work rather than repetitive processes.

    For instance, AI-powered analytics help marketers assess consumer sentiment and predict trends with unprecedented accuracy. Similarly, AI assists engineers and researchers in designing smarter infrastructure, accelerating scientific discoveries, and optimizing business operations.

    Rather than eliminating jobs, AI redefines them. Organizations must invest in upskilling employees to harness AI effectively, fostering a collaborative environment where human creativity complements machine intelligence.

    Ethical Considerations and Responsible AI Development

    With great power comes great responsibility. As AI grows more sophisticated, ethical concerns regarding data privacy, bias, and transparency become more pressing.

    AI systems rely on large datasets to function, raising concerns about privacy and security. Governments and corporations must establish stringent regulations to protect user information from misuse or exploitation. Additionally, biases in AI models can lead to discriminatory outcomes, necessitating fair and inclusive development practices.

    Transparency is another critical factor. Users should understand how AI-driven decisions are made, particularly in sensitive areas like healthcare and criminal justice. Ethical AI implementation requires collaboration between policymakers, technologists, and the public to ensure that advancements benefit society as a whole.

    The Future of AI: A Balance Between Progress and Ethics

    The importance of AI in society cannot be understated. It enhances productivity, improves decision-making, and solves complex problems across industries. However, its responsible use is paramount to prevent unintended consequences.

    As AI continues to evolve, collaboration between academia, businesses, and governments is necessary to establish ethical guidelines and maximize its benefits. The goal should not be to replace human intelligence but to amplify it, fostering innovation while ensuring accountability.

    Ultimately, AI is more than just a tool - it is a transformative force shaping the future. Harnessing its potential wisely will determine whether it serves as a steppingstone toward progress or a challenge requiring careful navigation.

    Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.


    18 May 2025

    #️⃣Software Engineering: Mea Culpa (Part VII: A Look Forward)

    Software Engineering Series

    I worked for more than 20 years in various areas related to ERP systems - Data Migrations, Business Intelligence/Analytics, Data Warehousing, Data Management, Project Management, (data) integrations, Quality Assurance, and much more - having experience with IFS IV, Oracle e-Business Suite, MS Dynamics AX 2009 and, during the past 3-7 years, also with MS Dynamics 365 Finance, SCM & HR (in that order). Much earlier, I started to work with SQL Server (2000-2019) and Oracle, and more recently with Azure Synapse and MS Fabric, writing over time more than 800 ad-hoc queries and reports for the various stakeholders, covering all the important areas, as well as many more queries for monitoring the various environments. 

    In the areas where I couldn’t acquire experience on the job, I tried to address this by learning in my free time. I did it because I take my profession seriously, and I want to know how (some) things work. I have thus put a lot of time into staying current with what’s happening in the MS Fabric world, from Power BI to KQL, Python, dataflows, SQL databases and much more. These technologies are Microsoft’s bet, though at least from the German market’s perspective, all bets are off! Probably, many companies are circumspect or need more time to react to the political and economic impulses, or perhaps some companies are already in bad shape. 

    Unfortunately, the political context has a broad impact on the economy, on what’s happening in the job market right now! However, these two aspects are not the only problem. Between candidates and jobs, the distance seems to grow, a dense wall of opinion being built, multiple layers based on presumptions filtering out voices that (still) matter! Does my experience matter, or does it become obsolete like the technologies I used to work with? But I continued to learn, to stay current… Or do I need to delete everything that reminds me of the old?

    To succeed, or at least be hired today, one must fit a pattern that frankly doesn’t make sense! Yes, soft skills are important, though not all of them can compensate for the lack of technical skills! There seems to be a tendency to exaggerate some of the qualities associated with skills or, better said, to hide behind big words. Sometimes it feels like an inaccurate adaptation of Shakespeare’s stage on which we are merely players.

    More likely, this lack of pragmatism will lead to suboptimal constructions that tend to succumb under their own weight. All the inefficiencies need to be corrected, or somebody (or something) must be able to bear their weight. I saw this happen too often in ERP implementations! Big words don’t compensate for the lack of pragmatism, skills, knowledge, effort or management! For many organizations the answer to today’s problems is more management, which occasionally might be the right approach, though it is not a universal solution for everything that crosses our path(s).

    One of society’s answers to today’s problems seems to be the refuge in AI. So, I wonder - where am I going now? Jobless, without an acceptable perspective, with AI penetrating the markets and probably making many jobs obsolete. One must adapt, but adapt to what? AI is brainless even if it can mimic intelligence! It can probably do more in time, to the degree that many more jobs will become obsolete (and I wonder what will happen to all those people). 

    Conversely, some trends will probably provoke countertrends; however, it’s challenging to depict in clear terms a future still in the making. Society seems to be at a crossroads, one more important than mine.


    29 April 2025

    🏭🗒️Microsoft Fabric: Purview [Notes]

    Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

    Last updated: 29-Apr-2025

    [Microsoft Purview] Purview
    • {def} comprehensive data governance and security platform designed to help organizations manage, protect, and govern their data across various environments [1]
      • incl. on-premises, cloud & SaaS applications [1]
      • provides the highest and most flexible level of functionality for data governance in MF [1]
        • offers comprehensive tools for 
          • data discovery
          • data classification
          • data cataloging
    • {capability} managing the data estate
      • {tool} dedicated portal
        • aka Fabric Admin portal
        • used to control tenant settings, capacities, domains, and other objects, typically reserved for administrators
      • {type} logical containers
        • used to control access to data and capabilities [1]
        • {level} tenants
          • settings for Fabric administrators [1]
        • {level} domains
          • group data that is relevant to a single business area or subject field [1]
        • {level} workspaces 
          • group Fabric items used by a single team or department [1]
      • {type} capacities
        • objects that limit compute resource usage for all Fabric workloads [1]
    • {capability} metadata scanning
      • extracts values from data lakes
        • e.g. names, identities, sensitivities, endorsements, etc. 
        • can be used to analyze and set governance policies [1]
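The scanning idea above can be sketched in a few lines: walk the assets of a (mock) data lake, extract a metadata record per asset, then evaluate a governance policy over the records. The asset fields and the policy rule below are illustrative assumptions, not Purview's API.

```python
# Conceptual sketch of metadata scanning: collect attributes per asset,
# then evaluate a simple governance policy over the extracted metadata.
# Asset fields and the policy rule are illustrative, not Purview's API.

def scan_assets(assets):
    """Extract a flat metadata record for each asset in a mock data lake."""
    catalog = []
    for asset in assets:
        catalog.append({
            "name": asset["name"],
            "owner": asset.get("owner", "unknown"),
            "sensitivity": asset.get("sensitivity", "unclassified"),
            "endorsed": asset.get("endorsed", False),
        })
    return catalog

def policy_violations(catalog):
    """Example rule: sensitive assets must have a known owner."""
    return [m["name"] for m in catalog
            if m["sensitivity"] == "confidential" and m["owner"] == "unknown"]

lake = [
    {"name": "sales_2024", "owner": "finance", "sensitivity": "confidential"},
    {"name": "web_logs", "sensitivity": "confidential"},  # no owner recorded
    {"name": "calendar", "owner": "ops"},
]
print(policy_violations(scan_assets(lake)))  # ['web_logs']
```

The point is that governance policies operate on the extracted metadata, not on the data itself, which is why the scan step matters.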
    • {capability} secure and protect data
      • ensure that data is protected against unauthorized access and destructive attacks [1]
      • compliant with data storage regulations applicable in your region [1]
      • {tool} data tags
        • allows to identify the sensitivity of data and apply data retention and protection policies [1]
      • {tool} workspace roles
        • define the users who are authorized to access the data in a workspace [1]
      • {tool} data-level controls
        • used at the level of Fabric items
          • e.g. tables, rows, and columns to impose granular restrictions.
      • {tool} certifications
        • Fabric is compliant with many data management certifications
          • incl. HIPAA BAA, ISO/IEC 27017, ISO/IEC 27018, ISO/IEC 27001, ISO/IEC 27701 [1]
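The data-level controls mentioned above boil down to filtering rows and hiding columns per role. A minimal sketch of the idea (the roles, rules, and data are hypothetical; in Fabric this is enforced at the level of tables, rows, and columns):

```python
# Illustrative sketch of data-level controls: restrict which rows and
# columns a user may see based on role. Role names and rules are invented
# for illustration, not a Fabric API.

ROW_RULES = {"de_sales": lambda row: row["region"] == "DE"}
HIDDEN_COLUMNS = {"de_sales": {"salary"}}

def apply_controls(rows, role):
    rule = ROW_RULES.get(role, lambda row: True)   # default: see all rows
    hidden = HIDDEN_COLUMNS.get(role, set())       # default: see all columns
    return [{k: v for k, v in row.items() if k not in hidden}
            for row in rows if rule(row)]

data = [
    {"region": "DE", "rep": "Anna", "salary": 70},
    {"region": "US", "rep": "Bob", "salary": 80},
]
print(apply_controls(data, "de_sales"))  # [{'region': 'DE', 'rep': 'Anna'}]
```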
    • {feature} OneLake data hub
      • allows users to find and explore the data in their estate.
    • {feature} endorsement
      • allows users to endorse a Fabric item to identify it as of high quality [1]
        • help other users to trust the data that the item contains [1]
    • {feature} data lineage
      • allows users to understand the flow of data between items in a workspace and the impact that a change would have [1]
    • {feature} monitoring hub
      • allows to monitor activities for the Fabric items that the user has permission to view [1]
    • {feature} capacity metrics
      • app used to monitor usage and consumption
    • {feature} allows to automate the identification of sensitive information and provides a centralized repository for metadata [1]
    • {feature} allows to find, manage, and govern data across various environments
      • incl. both on-premises and cloud-based systems [1]
      • supports compliance and risk management with features that monitor regulatory adherence and assess data vulnerabilities [1]
    • {feature} integrated with other Microsoft services and third-party tools 
      • {benefit} enhances its utility
      • {benefit} streamlines data access controls
        • enforcing policies, and delivering insights into data lineage [1]
    • {benefit} helps organizations maintain data integrity, comply with regulations, and use their data effectively for strategic decision-making [1]
    • {feature} Data Catalog
      • {benefit} allows users to discover, understand, and manage their organization's data assets
        • search for and browse datasets
        • view metadata
        • gain insights into the data’s lineage, classification, and sensitivity labels [1]
      • {benefit} promotes collaboration
        • users can annotate datasets with tags to improve discoverability and data governance [1]
      • targets users and administrators
      • {benefit} allows to discover where patient records are held by searching for keywords [1]
      • {benefit} allows to label documents and items based on their sensitivity [1]
      • {benefit} allows to use access policies to manage self-service access requests [1]
    • {feature} Information Protection
      • used to classify, label, and protect sensitive data throughout the organization [1]
        • by applying customizable sensitivity labels, users classify records. [1]
        • {concept} policies
          • define access controls and enforce encryption
          • labels follow the data wherever it goes
          • helps organizations meet compliance requirements while safeguarding data against accidental exposure or malicious threats [1]
      • allows to protect records with policies to encrypt data and impose IRM
    • {feature} Data Loss Prevention (DLP)
      • the practice of protecting sensitive data to reduce the risk from oversharing [2]
        • implemented by defining and applying DLP policies [2]
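At its core, a DLP policy is a set of detection rules evaluated against content before it is shared. A minimal sketch of the idea in Python (the patterns and policy shape are illustrative, not Purview's rule language):

```python
import re

# Minimal sketch of how a DLP policy flags sensitive content before it is
# shared. The two detection rules below are illustrative examples only.

DLP_POLICY = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def scan(text):
    """Return the names of the DLP rules that the text violates."""
    return sorted(name for name, pat in DLP_POLICY.items() if pat.search(text))

print(scan("Contact jane.doe@example.com, IBAN DE44500105175407324931"))
# ['email', 'iban']
```

A real policy would also define the action taken on a match (block, warn, audit), which is exactly what "defining and applying DLP policies" covers.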
    • {feature} Audit
      • user activities are automatically logged and appear in the Purview audit log
        • e.g. creating files or accessing Fabric items
    • {feature} connect Purview to Fabric in a different tenant
      • all functionality is supported, except that 
        • {limitation} Purview's live view isn't available for Fabric items [1]
        • {limitation} the system can't identify user registration automatically [1]
        • {limitation} managed identity can’t be used for authentication in cross-tenant connections [1]
          • {workaround} use a service principal or delegated authentication [1]
    • {feature} Purview hub
      • displays reports and insights about Fabric items [1]
        • acts as a centralized location to begin data governance and access more advanced features [1]
        • via Settings >> Microsoft Purview hub
        • administrators see information about their entire organization's Fabric data estate
        • provides information about
          • Data Catalog
          • Information Protection
          • Audit
      • the data section displays tables and graphs that analyze the entire organization's items in MF
        • users only see information about their own Fabric items and data

    References:
    [1] Microsoft Learn (2024) Purview: Govern data in Microsoft Fabric with Purview[link]
    [2] Microsoft Learn (2024) Purview: Learn about data loss prevention [link]
    [3] Microsoft Learn (2024) [link]

    Resources:

    Acronyms:
    DLP - Data Loss Prevention
    M365 - Microsoft 365
    MF - Microsoft Fabric
    SaaS - Software-as-a-Service

    26 April 2025

    🏭🗒️Microsoft Fabric: Deployment Pipelines [Notes]

    Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

    Last updated: 26-Apr-2025

    [Microsoft Fabric] Deployment Pipelines

    • {def} a structured process that enables content creators to manage the lifecycle of their organizational assets [5]
      • enable creators to develop and test content in the service before it reaches the users [5]
        • can simplify the deployment process to development, test, and production workspaces [5]
        • one Premium workspace is assigned to each stage [5]
        • each stage can have 
          • different configurations [5]
          • different databases or different query parameters [5]
    • {action} create pipeline
      • from the deployment pipelines entry point in Fabric [5]
        • creating a pipeline from a workspace automatically assigns it to the pipeline [5]
      • {action} define how many stages it should have and what they should be called [5]
        • {default} has three stages
          • e.g. Development, Test, and Production
          • the number of stages can be set anywhere between 2 and 10
          • {action} add another stage
          • {action} delete stage
          • {action} rename stage 
            • by typing a new name in the box
          • {action} share a pipeline with others
            • users receive access to the pipeline and become pipeline admins [5]
          • ⇐ the number of stages is permanent [5]
            • it can't be changed after the pipeline is created [5]
      • {action} add content to the pipeline [5]
        • done by assigning a workspace to the pipeline stage [5]
          • the workspace can be assigned to any stage [5]
      • {action|optional} make a stage public
        • {default} the final stage of the pipeline is made public
        • a consumer of a public stage without access to the pipeline sees it as a regular workspace [5]
          • without the stage name and deployment pipeline icon on the workspace page next to the workspace name [5]
      • {action} deploy to an empty stage
        • when finishing the work in one pipeline stage, the content can be deployed to the next stage [5] 
          • deployment can happen in any direction [5]
        • {option} full deployment 
          • deploy all content to the target stage [5]
        • {option} selective deployment 
          • allows select the content to deploy to the target stage [5]
        • {option} backward deployment 
          • deploy content from a later stage to an earlier stage in the pipeline [5] 
          • {restriction} only possible when the target stage is empty [5]
      • {action} deploy content between stages [5]
        • content can be deployed even if the next stage has content
          • paired items are overwritten [5]
      • {action|optional} create deployment rules
        • when deploying content between pipeline stages, allow changes to content while keeping some settings intact [5] 
        • once a rule is defined or changed, the content must be redeployed
          • the deployed content inherits the value defined in the deployment rule [5]
          • the value always applies as long as the rule is unchanged and valid [5]
      • {feature} deployment history 
        • allows to see the last time content was deployed to each stage [5]
        • allows to track the time between deployments [5]
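Deployment rules can be pictured as stage-specific overrides applied at deployment time: the content is copied, but ruled settings take the target stage's values. A hedged sketch of that idea (the stage names, parameters, and server names are invented for illustration, not the Fabric rule engine):

```python
# Sketch of applying deployment rules when promoting an item between
# stages: content is copied, but ruled settings take the target stage's
# value. Rule shape and server names are invented for illustration.

RULES = {  # per-stage parameter overrides, defined once per pipeline
    "Test": {"server": "sql-test.contoso.local", "db": "SalesTest"},
    "Production": {"server": "sql-prod.contoso.local", "db": "Sales"},
}

def deploy(item, target_stage):
    deployed = dict(item)                         # content is carried over...
    deployed.update(RULES.get(target_stage, {}))  # ...ruled settings stay stage-specific
    return deployed

dev_item = {"name": "SalesModel", "server": "sql-dev.local", "db": "SalesDev"}
print(deploy(dev_item, "Production")["server"])  # sql-prod.contoso.local
```

This mirrors the note above: once a rule is defined, the deployed content inherits the rule's value for as long as the rule is unchanged and valid.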
    • {concept} pairing
      • {def} the process by which an item in one stage of the deployment pipeline is associated with the same item in the adjacent stage
        • applies to reports, dashboards, semantic models
        • paired items appear on the same line in the pipeline content list [5]
          • ⇐ items that aren't paired, appear on a line by themselves [5]
        • the items remain paired even if their name changes
        • items added after the workspace is assigned to a pipeline aren't automatically paired [5]
          • ⇐ one can have identical items in adjacent workspaces that aren't paired [5]
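The pairing concept can be illustrated by matching items across two adjacent workspaces. The sketch below matches by (name, type) purely for illustration; Fabric tracks pairing internally, which is why renamed items stay paired (something this naive sketch does not model):

```python
# Sketch of the pairing idea: items in adjacent stages are matched so they
# appear on the same line of the content list; unmatched items stand alone.
# Matching by (name, type) is an illustrative simplification.

def pair_items(source, target):
    """Return (paired, source_only, target_only) between two stage workspaces."""
    s = {(i["name"], i["type"]): i for i in source}
    t = {(i["name"], i["type"]): i for i in target}
    paired = sorted(k for k in s if k in t)
    return paired, sorted(set(s) - set(t)), sorted(set(t) - set(s))

dev = [{"name": "Sales", "type": "report"}, {"name": "KPIs", "type": "dashboard"}]
test = [{"name": "Sales", "type": "report"}]
print(pair_items(dev, test))
# ([('Sales', 'report')], [('KPIs', 'dashboard')], [])
```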
    • [lakehouse]
      • can be removed as a dependent object upon deployment [3]
      • supports mapping different Lakehouses within the deployment pipeline context [3]
      • {default} a new empty Lakehouse object with the same name is created in the target workspace [3]
        • ⇐ if nothing is specified during deployment pipeline configuration
        • notebook and Spark job definitions are remapped to reference the new lakehouse object in the new workspace [3]
        • {warning} a new empty Lakehouse object with the same name is still created in the target workspace [3]
        • SQL Analytics endpoints and semantic models are provisioned
        • no object inside the Lakehouse is overwritten [3]
        • updates to Lakehouse name can be synchronized across workspaces in a deployment pipeline context [3] 
    • [notebook] deployment rules can be used to customize the behavior of notebooks when deployed [4]
      • e.g. change notebook's default lakehouse [4]
      • {feature} auto-binding
        • binds the default lakehouse and attached environment within the same workspace when deploying to next stage [4]
    • [environment] custom pool is not supported in deployment pipeline
      • the configurations of Compute section in the destination environment are set with default values [6]
      • ⇐ subject to change in upcoming releases [6]
    • [warehouse]
      • [database project] ALTER TABLE to add a constraint or column
        • {limitation} the table will be dropped and recreated when deploying, resulting in data loss
      • {recommendation} do not create a Dataflow Gen2 with an output destination to the warehouse
        • ⇐ deployment would be blocked by a new item named DataflowsStagingWarehouse that appears in the deployment pipeline [10]
      • SQL analytics endpoint is not supported
    • [Eventhouse]
      • {limitation} the connection must be reconfigured in destination that use Direct Ingestion mode [8]
    • [EventStream]
      • {limitation} limited support for cross-workspace scenarios
        • {recommendation} make sure all EventStream destinations are within the same workspace [8]
    • [KQL database]
      • applies to tables, functions, materialized views [7]
    • [KQL queryset]
      • applies to tabs, data sources [7]
    • [real-time dashboard]
      • applies to data sources, parameters, base queries, tiles [7]
    • [SQL database]
      • includes the specific differences between the individual database objects in the development and test workspaces [9]
    • can be also used with

      References:
      [1] Microsoft Learn (2024) Get started with deployment pipelines [link]
      [2] Microsoft Learn (2024) Implement continuous integration and continuous delivery (CI/CD) in Microsoft Fabric [link]
      [3] Microsoft Learn (2024)  Lakehouse deployment pipelines and git integration (Preview) [link]
      [4] Microsoft Learn (2024) Notebook source control and deployment [link]
      [5] Microsoft Learn (2024) Introduction to deployment pipelines [link]
      [6] Microsoft Learn (2024) Environment Git integration and deployment pipeline [link]
      [7] Microsoft Learn (2024) Real-Time Intelligence: Git integration and deployment pipelines (Preview) [link]
      [8] Microsoft Learn (2024) Eventstream CI/CD - Git Integration and Deployment Pipeline [link]
      [9] Microsoft Learn (2024) Get started with deployment pipelines integration with SQL database in Microsoft Fabric [link]
      [10] Microsoft Learn (2025) Source control with Warehouse (preview) [link]

      Resources:

      Acronyms:
      CLM - Content Lifecycle Management
      UAT - User Acceptance Testing

      🏭🗒️Microsoft Fabric: Power BI Environments [Notes]

      Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

      Last updated: 26-Apr-2025

      Enterprise Content Publishing [2]

      [Microsoft Fabric] Power BI Environments

      • {def} structured spaces within Microsoft Fabric that help organizations manage Power BI assets through their entire lifecycle
      • {environment} development 
        • allows to develop the solution
        • accessible only to the development team 
          • via Contributor access
        • {recommendation} use Power BI Desktop as local development environment
          • {benefit} allows to try, explore, and review updates to reports and datasets
            • once the work is done, upload the new version to the development stage
          • {benefit} enables collaborating and changing dashboards
          • {benefit} avoids duplication 
            • making online changes, downloading the .pbix file, and then uploading it again creates duplicate reports and datasets
        • {recommendation} use version control to keep the .pbix files up to date
          • [OneDrive] use Power BI's autosync
            • {alternative} SharePoint Online with folder synchronization
            • {alternative} GitHub and/or VSTS with local repository & folder synchronization
        • [enterprise scale deployments] 
          • {recommendation} separate dataset from reports and dashboards’ development
            • use the deployment pipelines selective deploy option [22]
            • create separate .pbix files for datasets and reports [22]
              • create a dataset .pbix file and upload it to the development stage (see shared datasets) [22]
              • create .pbix only for the report, and connect it to the published dataset using a live connection [22]
            • {benefit} allows different creators to separately work on modeling and visualizations, and deploy them to production independently
          • {recommendation} separate data model from report and dashboard development
            • allows using advanced capabilities 
              • e.g. source control, merging diff changes, automated processes
            • separate the development from test data sources [1]
              • the development database should be relatively small [1]
        • {recommendation} use only a subset of the data [1]
          • ⇐ otherwise the data volume can slow down the development [1]
      • {environment} user acceptance testing (UAT)
        • test environment that sits between development and production within the deployment lifecycle
          • it's not necessary for all Power BI solutions [3]
          • allows to test the solution before deploying it into production
            • all testers must have 
              • View access for testing
              • Contributor access for report authoring
          • involves business users who are SMEs
            • provide approval that the content 
              • is accurate
              • meets requirements
              • can be deployed for wider consumption
        • {recommendation} check report’s load and the interactions to find out if changes impact performance [1]
        • {recommendation} monitor the load on the capacity to catch extreme loads before they reach production [1]
        • {recommendation} test data refresh in the Power BI service regularly during development [20]
      • {environment} production
        • {concept} staged deployment
          • {goal} help minimize risk, user disruption, or address other concerns [3]
            • the deployment involves a smaller group of pilot users who provide feedback [3]
        • {recommendation} set production deployment rules for data sources and parameters defined in the dataset [1]
          • allows ensuring the data in production is always connected and available to users [1]
        • {recommendation} don’t upload a new .pbix version directly to the production stage
          •  ⇐ without going through testing
      • {feature|preview} deployment pipelines 
        • enable creators to develop and test content in the service before it reaches the users [5]
      • {recommendation} build separate databases for development and testing 
        • helps protect production data [1]
      • {recommendation} make sure that the test and production environment have similar characteristics [1]
        • e.g. data volume, usage volume, similar capacity
        • {warning} testing into production can make production unstable [1]
        • {recommendation} use Azure A capacities [22]
      • {recommendation} for formal projects, consider creating an environment for each phase
      • {recommendation} enable users to connect to published datasets to create their own reports
      • {recommendation} use parameters to store connection details 
        • e.g. instance names, database names
        • ⇐  deployment pipelines allow configuring parameter rules to set specific values for the development, test, and production stages
          • alternatively data source rules can be used to specify a connection string for a given dataset
            • {restriction} in deployment pipelines, this isn't supported for all data sources
      • {recommendation} keep the data in blob storage under 50k blobs and 5 GB of data in total to prevent timeouts [29]
      • {recommendation} provide data to self-service authors from a centralized data warehouse [20]
        • allows to minimize the amount of work that self-service authors need to take on [20]
      • {recommendation} minimize the use of Excel, csv, and text files as sources when practical [20]
      • {recommendation} store source files in a central location accessible by all coauthors of the Power BI solution [20]
      • {recommendation} be aware of API connectivity issues and limits [20]
      • {recommendation} know how to support SaaS solutions from AppSource and expect further data integration requests [20]
      • {recommendation} minimize the query load on source systems [20]
        • use incremental refresh in Power BI for the dataset(s)
        • use a Power BI dataflow that extracts the data from the source on a schedule
        • reduce the dataset size by only extracting the needed amount of data 
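The incremental-refresh idea behind these recommendations is simple: persist a watermark and extract only the rows changed after it, so the source system is not rescanned in full on every refresh. An in-memory sketch (the table and watermark handling are stand-ins for the real source and refresh policy):

```python
# Sketch of the incremental-refresh idea used to minimize source load:
# only rows newer than the stored watermark are extracted on each run.
# The rows and watermark store are in-memory stand-ins for illustration.

source_rows = [
    {"id": 1, "modified": "2025-01-01"},
    {"id": 2, "modified": "2025-02-01"},
    {"id": 3, "modified": "2025-03-01"},
]

def incremental_extract(rows, watermark):
    """Return the rows changed after the watermark, plus the new watermark."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_mark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_mark

batch, mark = incremental_extract(source_rows, "2025-01-15")
print(len(batch), mark)  # 2 2025-03-01
```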
      • {recommendation} expect data refresh operations to take some time [20]
      • {recommendation} use relational database sources when practical [20]
      • {recommendation} make the data easily accessible [20]
      • [knowledge area] knowledge transfer
        • {recommendation} maintain a list of best practices and review it regularly [24]
        • {recommendation} develop a training plan for the various types of users [24]
          • usability training for read-only report/app users [24]
          • self-service reporting for report authors & data analysts [24]
          • more elaborated training for advanced analysts & developers [24]
      • [knowledge area] lifecycle management
        • consists of the processes and practices used to handle content from its creation to its eventual retirement [6]
        • {recommendation} postfix files with a 3-part version number in the Development stage [24]
          • remove the version number when publishing files to UAT and production 
        • {recommendation} back up files for archiving 
        • {recommendation} track version history 
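The version-number recommendation can be automated with two small helpers: one to postfix a 3-part version in development, one to strip it when publishing to UAT or production. The `_vX.Y.Z` naming convention below is an assumption for illustration, not a Power BI feature:

```python
import re

# Sketch of the versioning recommendation: postfix development files with a
# three-part version number and strip it when publishing to UAT/production.
# The `_vX.Y.Z` suffix convention is assumed, not prescribed by Power BI.

def versioned(name, major, minor, patch):
    """Add a 3-part version suffix to a file name, e.g. Report_v1.4.2.pbix."""
    stem, ext = name.rsplit(".", 1)
    return f"{stem}_v{major}.{minor}.{patch}.{ext}"

def published(name):
    """Remove a `_vX.Y.Z` suffix before publishing to UAT or production."""
    return re.sub(r"_v\d+\.\d+\.\d+(?=\.[^.]+$)", "", name)

dev = versioned("SalesReport.pbix", 1, 4, 2)
print(dev, "->", published(dev))  # SalesReport_v1.4.2.pbix -> SalesReport.pbix
```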

        References:
        [1] Microsoft Learn (2021) Fabric: Deployment pipelines best practices [link]
        [2] Microsoft Learn (2024) Power BI: Power BI usage scenarios: Enterprise content publishing [link]
        [3] Microsoft Learn (2024) Deploy to Power BI [link]
        [4] Microsoft Learn (2024) Power BI implementation planning: Content lifecycle management [link]
        [5] Microsoft Learn (2024) Introduction to deployment pipelines [link]
        [6] Microsoft Learn (2024) Power BI implementation planning: Content lifecycle management [link]
        [20] Microsoft (2020) Planning a Power BI  Enterprise Deployment [White paper] [link]
        [22] Power BI Docs (2021) Create Power BI Embedded capacity in the Azure portal [link]
        [24] Paul Turley (2019)  A Best Practice Guide and Checklist for Power BI Projects

        Resources:

        Acronyms:
        API - Application Programming Interface
        CLM - Content Lifecycle Management
        COE - Center of Excellence
        SaaS - Software-as-a-Service
        SME - Subject Matter Expert
        UAT - User Acceptance Testing
        VSTS - Visual Studio Team System

        25 April 2025

        💫🗒️ERP Systems: Microsoft Dynamics 365's Business Process Catalog (BPC) [Notes]

        Disclaimer: This is work in progress intended to consolidate information from the various sources and not to provide a complete overview of all the features. Please refer to the documentation for a complete overview!

        Last updated: 25-Apr-2025

        Business Process Catalog - End-to-End Scenarios

        [Dynamics 365] Business Process Catalog (BPC)

        • {def} lists of end-to-end processes that are commonly used to manage or support work within an organization [1]
          • agnostic catalog of business processes contained within the entire D365 solution space [3]
            • {benefit} efficiency and time savings [3]
            • {benefit} best practices [3]
            • {benefit} reduced risk [3]
            • {benefit} technology alignment [3]
            • {benefit} scalability [3]
            • {benefit} cross-industry applicability [3]
          • stored in an Excel workbook
            • used to organize and prioritize the work on the business process documentation [1]
            • {recommendation} check the latest versions (see [R1])
          • assigns unique IDs to 
            • {concept} end-to-end scenario
              • describe in business terms 
                • not in terms of software technology
              • includes the high-level products and features that map to the process [3]
              • covers two or more business process areas
              • {purpose} map products and features to benefits that can be understood in business contexts [3]
            • {concept} business process areas
              • combination of business language and basic D365 terminology [3]
              • groups business processes for easier searching and navigation [1]
              • separated by major job functions or departments in an organization [1]
              • {purpose} map concepts to benefits that can be understood in business context [3]
              • more than 90 business process areas defined [1]
            • {concept} business processes
              • a series of structured activities and tasks that organizations use to achieve specific goals and objectives [3]
                • efficiency and productivity
                • consistency and quality
                • cost reduction
                • risk management
                • scalability
                • data-driven decision-making
              • a set of tasks in a sequence that is completed to achieve a specific objective [5]
                • define when each step is done in the implementation [5] [3]
                • define how many are needed [5] [3]
              • covers a wide range of structured, often sequenced, activities or tasks to achieve a predetermined organizational goal
              • can refer to the cumulative effects of all steps progressing toward a business goal
              • describes a function or process that D365 supports
                • more than 700 business processes identified
                • {goal} provide a single entry point with links to relevant product-specific content [1]
              • {concept} business process guide
                • provides documentation on the structure and patterns of the process along with guidance on how to use them in a process-oriented implementation [3]
                • based on a catalog of business process supported by D365 [3]
              • {concept} process steps 
                • represented sequentially, top to bottom
                  • can include hyperlinks to the product documentation [5] 
                  • {recommendation} avoid back and forth in the steps as much as possible [5]
                • can be
                  • forms used in D365 [5]
                  • steps completed in LCS, PPAC, Azure or other Microsoft products [5]
                  • steps that are done outside the system (incl. third-party system) [5]
                  • steps that are done manually [5]
                • are not 
                  • product documentation [5]
                  • a list of each click to perform a task [5]
              • {concept} process states
                • include
                  • project phase 
                    • e.g. strategize, initialize, develop, prepare, operate
                  • configuration 
                    • e.g. base, foundation, optional
                  • process type
                    • e.g. configuration, operational
            • {concept} patterns
              • repeatable configurations that support a specific business process [1]
                • specific way of setting up D365 to achieve an objective [1]
                • address specific challenges in implementations and are based on a specific scenario or best practice [6]
                • the solution is embedded into the application [6]
                • includes high-level process steps [6]
              • include the most common use cases, scenarios, and industries [1]
              • {goal} provide a baseline for implementations
                • more than 2000 patterns, and we expect that number to grow significantly over time [1]
              • {activity} naming a new pattern
                • starts with a verb
                • describes a process
                • includes product names
                • indicates the industry
                • indicates AppSource products
            • {concept} reference architecture 
              • acts as a core architecture with a common solution that applies to many scenarios [6]
              • typically used for integrations to external solutions [6]
              • must include an architecture diagram [6]
          • {concept} process governance
            • {benefit} improved quality
            • {benefit} enhanced decision making
            • {benefit} agility & adaptability
            • {benefit} Sbd alignment
            • {goal} enhance efficiency 
            • {goal} ensure compliance 
            • {goal} facilitate accountability 
            • {concept} policy
            • {concept} procedure
            • {concept} control
          • {concept} scope definition
            • {recommendation} avoid replicating current processes without considering future needs [4]
              • {risk} replicating processes in the new system without re-evaluating and optimizing [4] 
              • {impact} missed opportunities for process improvement [4]
            • {recommendation} align processes with overarching business goals rather than the limitations of the current system [4]
          • {concept} guidance hub
            • a central landing spot for D365 guidance and tools
            • contains cross-application documentations
        • {purpose} provide considerations and best practices for implementation [6]
        • {purpose} provide technical information for implementation [6]
        • {purpose} provide link to product documentation to achieve the tasks in scope [6]
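The catalog's ID scheme can be modeled as rows keyed by unique ID and grouped by business process area, mirroring the scenario > area > process hierarchy described above. The IDs and names below are invented examples; the authoritative source is the Excel workbook mentioned earlier:

```python
from collections import defaultdict

# Sketch of how the catalog's unique IDs organize content: end-to-end
# scenarios contain business process areas, which group business processes.
# IDs and names here are invented examples, not actual catalog entries.

catalog = [
    {"id": "10.20.100", "scenario": "Order to cash",
     "area": "Accounts receivable", "process": "Process customer invoices"},
    {"id": "10.20.110", "scenario": "Order to cash",
     "area": "Accounts receivable", "process": "Process customer payments"},
    {"id": "10.35.200", "scenario": "Order to cash",
     "area": "Sales", "process": "Manage sales orders"},
]

def by_area(rows):
    """Group business process IDs by their business process area."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["area"]].append(row["id"])
    return dict(grouped)

print(by_area(catalog))
# {'Accounts receivable': ['10.20.100', '10.20.110'], 'Sales': ['10.35.200']}
```

Grouping by area is exactly what makes the catalog searchable by job function or department, as noted above.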

        References:
        [1] Microsoft Learn (2024) Dynamics 365: Overview of end-to-end scenarios and business processes in Dynamics 365 [link]
        [2] Microsoft Dynamics 365 Community (2023) Business Process Guides - Business Process Guides [link]
        [3] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 2 Introduction to Business Processes [link]
        [4] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 3: Using the Business Process Catalog to Manage Project Scope and Estimation [link]
        [5] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 4: Authoring Business Processes [link]
        [6] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance - Part 5:  Authoring Business Processes Patterns and Use Cases [link]
        [7] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance  - Part 6: Conducting Process-Centric Discovery [link]
        [8] Microsoft Dynamics 365 Community (2024) Business Process Catalog and Guidance  - Part 7: Introduction to Process Governance [link]

        Resources:
        [R1] GitHub (2024) Business Process Catalog [link]
        [R2] Microsoft Learn (2024) Dynamics 365 guidance documentation and other resources [link]
        [R3] Dynamics 365 Blog (2025) Process, meet product: The business process catalog for Dynamics 365 [link]

        Acronyms:
        3T - Tools, Techniques, Tips
        ADO - 
        BPC - Business Process Catalog
        D365 - Dynamics 365
        LCS - Lifecycle Services
        PPAC - Power Platform admin center
        RFI - Request for Information
        RFP - Request for Proposal

        About Me

        Koeln, NRW, Germany
        IT Professional with more than 25 years experience in IT in the area of full life-cycle of Web/Desktop/Database Applications Development, Software Engineering, Consultancy, Data Management, Data Quality, Data Migrations, Reporting, ERP implementations & support, Team/Project/IT Management, etc.