
30 May 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 6: Why There Are No Limits to AI Growth)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on why there are no limits to growth for AI"

Introduction

Artificial Intelligence (AI) has transcended its early stages of development and is now an integral part of business, science, and daily life. Unlike traditional technologies that have clear limits on scalability, AI operates under a unique framework - it evolves, learns, and refines itself over time. This adaptability ensures that AI's growth remains exponential, breaking past barriers that once seemed insurmountable.

AI’s Ability to Learn and Evolve

AI differs from other technologies because it isn’t bound by static programming - it learns and improves through machine learning (ML) and deep learning. These processes enable AI to:

  • Optimize itself over time, making operations more efficient.
  • Develop new patterns that even human researchers may not anticipate.
  • Enhance accuracy and predictive capabilities across industries.

Unlike traditional software, which requires manual updates and modifications, AI continuously refines itself based on real-world interactions. This ability ensures perpetual growth, keeping AI ahead of conventional limitations.

Computational Power Is Increasing Rapidly

One of the biggest factors enabling AI’s endless growth is the exponential increase in computational power. Advances in quantum computing, cloud-based GPUs, and edge computing allow AI models to process massive datasets faster than ever. The more computing resources AI has access to, the more complex problems it can solve.

  • Cloud computing democratizes AI, making it accessible to businesses worldwide.
  • Quantum computing will eventually enable AI to process unimaginable volumes of data, far surpassing current limitations.
  • Neuromorphic computing mimics the human brain, allowing AI models to improve efficiency while consuming less energy.

As technology progresses, AI's capabilities will continue expanding - faster, smarter, and more adaptable than ever before.

AI Has Infinite Applications Across Industries

AI's impact isn’t confined to a single sector; its growth is fueled by its versatility. From healthcare to finance, retail to space exploration, AI finds new ways to integrate into human progress.

  • In medicine, AI assists in diagnosing diseases, customizing treatments, and even predicting outbreaks.
  • In business, AI automates operations, reduces inefficiencies, and enhances data-driven decision-making.
  • In research, AI enables discoveries in fields like genetics, material science, and astrophysics.
  • In entertainment, AI creates realistic CGI, automates music composition, and personalizes media experiences.

Since AI isn’t restricted to one field, its expansion is self-perpetuating - each industry unlocks new AI-driven possibilities, further fueling innovation.

AI Growth Is Driven by Human Curiosity

AI’s evolution isn’t happening in isolation; human ingenuity fuels its progress. Researchers, engineers, and businesses continuously explore new ways to push AI beyond current capabilities.

  • Developers refine AI models to improve accuracy and efficiency.
  • Businesses invest billions in AI research, ensuring advancements continue.
  • Governments and educational institutions integrate AI into policies and learning frameworks to accelerate adoption.

As long as humans remain curious about solving problems, enhancing intelligence, and optimizing processes, AI will continue to grow indefinitely.

Ethical Challenges Won’t Halt AI Growth

While ethical concerns - such as AI bias, surveillance, and job displacement - must be addressed, they won’t stop AI’s progress. Instead, they shape its development, ensuring responsible AI integration.

  • Regulations will guide ethical AI use rather than stifle growth.
  • Bias reduction strategies will make AI more inclusive while improving accuracy.
  • AI augmentation will reshape workforces, enabling humans to focus on creative and strategic tasks.

Rather than presenting limitations, ethical considerations ensure that AI’s limitless growth remains aligned with human values and societal needs.

Conclusion: AI’s Infinite Expansion

Unlike previous technological revolutions that reached plateaus, AI’s adaptive nature, increasing computational power, and infinite applications ensure continued growth. With every breakthrough in machine learning, automation, and intelligent computing, AI inches closer to solving problems previously deemed impossible.

The future of AI is limitless - and as long as humans continue pushing boundaries, AI will evolve indefinitely.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.

Previous Post <<||>> Next Post

29 May 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 5: Why There Are Limits to AI Growth)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on why there are limits to growth for AI"

Introduction

Artificial Intelligence (AI) is evolving rapidly, transforming industries and redefining how humans interact with technology. However, despite its advancements, AI is not without limitations. While many claim that AI’s potential is limitless, several fundamental constraints affect its development, usability, and ethical application. In this blog post, we explore why AI growth is not boundless and the factors that restrict its expansion.

Computational and Hardware Constraints

AI depends heavily on computational power to process large amounts of data efficiently. While improvements in processing speeds, cloud computing, and quantum computing are promising, hardware limitations remain a bottleneck.

  • Energy consumption: Training sophisticated AI models requires immense energy, leading to high costs and environmental concerns.
  • Hardware limitations: Even cutting-edge processors (GPUs, TPUs) struggle with increasingly complex AI models, limiting performance scalability.
  • Data storage needs: AI requires vast amounts of storage and memory, which can be difficult to scale beyond current technologies.

Despite ongoing advancements, these factors impose real-world limitations on AI expansion, preventing it from growing indefinitely.

Dependence on Data and Human Oversight

AI is fundamentally data-driven, meaning its accuracy and capabilities are only as good as the data it receives. However, data availability and quality present major challenges:

  • Data scarcity: Not all industries generate sufficient high-quality datasets, limiting AI effectiveness.
  • Bias in data: AI models can inherit human biases, leading to ethical concerns and flawed decision-making.
  • Human intervention: AI requires constant supervision for algorithm adjustments, data corrections, and error management.

Since AI models learn from existing human-created data, they are restricted by data inconsistencies, biases, and inaccuracies - creating natural growth barriers.

Ethical and Legal Restrictions

AI operates within a world governed by laws, regulations, and ethical constraints, which significantly limit its unrestricted development:

  • Privacy concerns: AI depends on personal data for optimization, but privacy regulations (GDPR, CCPA) restrict unchecked data usage.
  • Security risks: AI-driven automation increases the risk of hacking, misinformation, and deepfake fraud, requiring strict oversight.
  • Job displacement worries: Governments and institutions limit AI adoption to protect workers and avoid societal disruption.

As AI continues to advance, legal frameworks must adapt - but regulatory boundaries will always exist to prevent ethical breaches.

AI’s Lack of True Consciousness

Unlike humans, AI does not possess consciousness, intuition, or creativity in the way people do. While AI mimics intelligence, it lacks genuine understanding:

  • No self-awareness: AI does not truly “think”; it only processes patterns based on data inputs.
  • Limited creativity: AI can generate art and music, but it does not comprehend emotion, cultural context, or abstract thought.
  • Inability to make ethical judgments: AI lacks moral reasoning and cannot replace human values or ethical decision-making.

This inherent lack of cognitive ability prevents AI from achieving true autonomy, reinforcing its growth limitations.

Final Thoughts: AI’s Evolution Has Limits

While AI continues to expand, it is not limitless due to constraints in hardware, data, ethics, and intelligence itself.

AI can optimize processes, enhance decision-making, and revolutionize industries, but it will always require human oversight and face technical and ethical barriers.

Rather than seeing AI as boundless, we should focus on responsible development, ensuring it remains a powerful tool rather than an unchecked force.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.

Previous Post <<||>> Next Post

26 April 2025

🏭🗒️Microsoft Fabric: Parameters in Dataflows Gen2 [Notes] 🆕

Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

Last updated: 26-Apr-2

[Microsoft Fabric] Dataflow Gen2 Parameters

  • {def} parameters that allow to dynamically control and customize Dataflows Gen2
    • makes them more flexible and reusable by enabling different inputs and scenarios without modifying the dataflow itself [1]
    • the dataflow is refreshed by passing parameter values outside of the Power Query editor through either
      • Fabric REST API [1]
      • native Fabric experiences [1]
    • parameter names are case sensitive [1]
    • {type} required parameters
      • {warning} the refresh fails if no value is passed for it [1]
    • {type} optional parameters
    • enabled via Parameters >> Enable parameters to be discovered and overridden for execution [1]
  • {limitation} dataflows with parameters can't be
    • scheduled for refresh through the Fabric scheduler [1]
    • manually triggered through the Fabric Workspace list or lineage view [1]
  • {limitation} parameters that affect the resource path of a data source or a destination are not supported [1]
    • ⇐ connections are linked to the exact data source path defined in the authored dataflow
      • can't currently be overridden to use other connections or resource paths [1]
  • {limitation} can't be leveraged by dataflows with incremental refresh [1]
  • {limitation} only parameters of type decimal number, whole number, text and true/false can be passed for override
    • any other data types don't produce a refresh request in the refresh history but show in the monitoring hub [1]
  • {warning} parameters allow other users who have permissions on the dataflow to refresh the data with other values [1]
  • {limitation} refresh history does not display information about the parameters passed during the invocation of the dataflow [1]
  • {limitation} monitoring hub doesn't display information about the parameters passed during the invocation of the dataflow [1]
  • {limitation} staged queries only keep the last data refresh of a dataflow stored in the Staging Lakehouse [1]
  • {limitation} only the first request will be accepted from duplicated requests for the same parameter values [1]
    • subsequent requests are rejected until the first request finishes its evaluation [1]
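Parameter values are passed from outside the Power Query editor, for instance through the Fabric REST API. The Python sketch below only illustrates how such a refresh request might be assembled; the base URL, endpoint path, and payload field names (executionData, parameterName, the type labels) are assumptions modeled on the Fabric job scheduler API and must be verified against [1] before use:

```python
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"  # assumed base URL

def build_refresh_request(workspace_id: str, dataflow_id: str,
                          parameters: dict) -> tuple:
    """Return the (url, json_body) pair for a parameterized refresh call.

    Only decimal number, whole number, text and true/false parameters
    can be overridden; parameter names are case sensitive.
    """
    # exact type lookup, so True maps to "Bool" rather than "Number";
    # the type labels themselves are assumptions
    type_map = {bool: "Bool", int: "Number", float: "Number", str: "Text"}
    body = {
        "executionData": {
            "parameters": [
                {"parameterName": name,
                 "type": type_map[type(value)],
                 "value": value}
                for name, value in parameters.items()
            ]
        }
    }
    url = (f"{FABRIC_API}/workspaces/{workspace_id}"
           f"/items/{dataflow_id}/jobs/instances?jobType=Refresh")
    return url, json.dumps(body)

url, body = build_refresh_request("ws-123", "df-456",
                                  {"Region": "EMEA", "TopN": 10})
```

Only the four overridable types are mapped, matching the limitation above; an actual call would additionally need a Microsoft Entra access token in the request headers.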

References:
[1] Microsoft Learn (2025) Use public parameters in Dataflow Gen2 (Preview) [link]

Resources:
[R1] Microsoft Fabric Blog (2025) Passing parameter values to refresh a Dataflow Gen2 (Preview) [link]

Acronyms:
API - Application Programming Interface
REST - Representational State Transfer

13 March 2025

🏭🗒️Microsoft Fabric: Workspaces [Notes]

Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

Last updated: 25-Mar-2025

[Microsoft Fabric] Workspace

  • {def} a collection of items that brings together different functionality in a single environment designed for collaboration
  • {default} created in organization's shared capacity
    • workspaces can be assigned to other capacities
      • includes My Workspaces
      • via Workspace settings >> Premium
  • components
    • header
      • contains
        • name 
        • brief description of the workspace
        • links to other functionality
    • toolbar 
      • contains 
        • controls for managing items to the workspace 
        • controls for managing files
    • view area
      • enables selecting a view
      • {type} list view
        • {subitem} task flow
          • area in which users can create or view a graphical representation of the data project [3]
            • ⇐ shows the logical flow of the project [3]
              • ⇐ it doesn't show the flow of data [3]
          • can be hidden via the Show/Hide arrows
        • {subitem} items list
          • area in which the users can see the items and folders in the workspace [3]
          • one can filter the items list by selecting the tasks, if any defined [3]
        • {subitem} resize bar
          • elements that allow to resize the task flow and items list by dragging the resize bar up or down [3]
      • {type} lineage view
        • shows the flow of data between the items in the workspace [3]
  • {feature} workspace settings 
    • allows to manage and update the workspace [3]
  • {feature} contact list 
    • allows to specify which users receive notification about issues occurring in the workspace [3] 
    • {default} contains workspace's creator [3]
  • {feature} SharePoint integration 
    • allows to configure a M365 Group whose SharePoint document library is available to workspace users [3]
      • ⇐ the group is created outside of MF first [3]
      • restrictions may apply to the environment
    • {best practice} give access to the workspace to the same M365 Group whose file storage is configured [3]
      • MF doesn't synchronize permissions between users or groups with workspace access, and users or groups with M365 Group membership [3]
  • {feature} workspace identity
    • an automatically managed service principal that can be associated with a Fabric workspace [6]
      • workspaces with a workspace identity can securely read or write to firewall-enabled ADLS Gen2 accounts through trusted workspace access for OneLake shortcuts [6]
      • Fabric creates a service principal in Microsoft Entra ID to represent the identity [6]
        • ⇐ an accompanying app registration is also created [6]
        • Fabric automatically manages the credentials associated with workspace identities [6]
          • ⇒ prevents credential leaks and downtime due to improper credential handling [6]
    • used to obtain Microsoft Entra tokens without the customer having to manage any credentials [6]
      • Fabric items can use the identity when connecting to resources that support Microsoft Entra authentication [6]
    • can be created in the workspace settings of any workspace except My workspaces
    • automatically assigned to the workspace contributor role and has access to workspace items [6]
  • {feature} workspace roles
    • allows to manage who can do what in a workspace [4]
    • sit on top of OneLake and divide the data lake into separate containers that can be secured independently [4]
    • extend the Power BI workspace roles by associating new MF capabilities 
      • e.g. data integration, data exploration
    • can be assigned to 
      • individual users
      • security groups
      • Microsoft 365 groups
      • distribution lists
    • {role} Admin
    • {role} Member
    • {role} Contributor
    • {role} Viewer
    • user groups
      • members get the role(s) assigned
      • users belonging to several groups get the highest level of permission provided by the roles they're assigned [4]
      • {concept} [nested group]
  • {concept} current workspace
    • the active open workspace
  • {action} create new workspace
  • {action} pin workspace
  • {action} delete workspace
    • everything contained within the workspace is deleted for all group members [3]
      • the associated app is also removed from AppSource [3]
    • {warning} if the workspace has a workspace identity, that workspace identity will be irretrievably lost [3]
      • this may cause Fabric items relying on the workspace identity for trusted workspace access or authentication to break [3]
    • only admins can perform the operation
  • {action} manage workspace
  • {action} take ownership of Fabric items
    • Fabric items may stop working correctly [5]
      • {scenario} the owner leaves the organization [5]
      • {scenario} the owner doesn't sign in for more than 90 days [5]
      • in such cases, anyone with read and write permissions on an item can take ownership of the item [5]
        • one also becomes the owner of any child items the item might have [5]
        • {limitation} one can't take over ownership of child items directly [5]
          • ⇐ one can take ownership only through the parent item [5]
  • {limitation} workspace retention
    • personal workspace
      • {default} 30 days [8]
    • collaborative workspace
      • {default} 7 days [8]
      • configurable between 7 to 90 days [8]
        • via Define workspace retention period setting [8]
    • during the retention period, Fabric administrators can restore the workspace or delete it permanently [8]
      • after that, the workspace is deleted permanently and it and its contents are irretrievably lost [8]
  • {limitation} can contain a maximum of 1000 items [8]
    • includes both parent and child items [8]
      • refers to Fabric and Power BI items [8]
    • workspaces with more items have a 180-day extension period applied automatically as of 10-Apr-2025 [8]
  • {limitation} certain special characters aren't supported in workspace names when using an XMLA endpoint [3]
  • {limitation} a user or a service principal can be a member of up to 1000 workspaces [3]
  • {feature} auditing
    • several activities are audited for workspaces [3]
      • CreateFolder
      • DeleteFolder
      • UpdateFolder
      • UpdateFolderAccess
  • {feature} workspace monitoring 
    • a secure, read-only Eventhouse database that collects and organizes logs and metrics from a range of Fabric items in the workspace [1]
      • accessible only to workspace users with at least a contributor role [1]
      • users can access and analyze logs and metrics [1]
      • the data is aggregated or detailed [1]
      • can be queried via KQL or SQL [1]
      • supports both historical log analysis and real-time data streaming [1]
      • accessible from the workspace [1]
        • one can build and save query sets and dashboards to simplify data exploration [1]
      • use the workspace settings to delete the database [1]
        •  wait about 15 minutes before recreating a deleted database [1]
    • {action} share the database
      • users need workspace member or admin role [1]
    • {limitation} one can enable either 
      • workspace monitoring 
      • log analytics
        • if enabled, the log analytics configuration must be deleted first before enabling workspace monitoring [1]
          • one should wait for a few hours before enabling workspace monitoring [1]
    • {limitation} retention period for monitoring data: 30 days [1]
    • {limitation} the ingestion can't be configured to filter for a specific log type or category [1]
      • e.g. error or workload type.
    • {limitation} user data operation logs aren't available even though the table is available in the monitoring database [1]
    • {prerequisite} Power BI Premium or Fabric capacity [1]
    • {prerequisite} the 'Workspace admins can turn on monitoring for their workspaces' tenant setting is enabled [1]
      • enabling the setting, requires Fabric administrator rights [1]
    • {prerequisite} admin role in the workspace [1]
  • workspace permissions
    •  the first security boundary for data within OneLake [7]
      • each workspace represents a single domain or project area where teams can collaborate on data [7]
      • security is managed through Fabric workspace roles [7]
    • items can have permissions configured separately from the workspace roles [7]
      • permissions can be configured either by [7]
        • sharing an item 
        • managing the permissions of an item

References:
[1] Microsoft Learn (2024) Fabric: What is workspace monitoring (preview)? [link]
[2] Microsoft Fabric Update Blog (2024) Announcing preview of Workspace Monitoring [link]
[3] Microsoft Learn (2024) Fabric: Workspaces in Microsoft Fabric and Power BI [link]
[4] Microsoft Learn (2024) Fabric: Roles in workspaces in Microsoft Fabric [link]
[5] Microsoft Learn (2024) Fabric: Take ownership of Fabric items [link]
[6] Microsoft Learn (2024) Fabric: Workspace identity [link]
[7] Microsoft Learn (2024) Fabric: Role-based access control (RBAC) [link]
[8] Microsoft Learn (2024) Fabric: Manage workspaces [link]

Resources:
[R1] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]

Acronyms:
ADLS Gen2 - Azure Data Lake Storage Gen2
KQL - Kusto Query Language
M365 - Microsoft 365
MF - Microsoft Fabric
SQL - Structured Query Language

03 March 2025

💎🤖SQL Reloaded: Copilot Stories (Part VIII: Recursive Functions)

In theory, recursive functions prove useful in generating a list of numbers based on a formula in which the final result depends on the result of the previous step. In other words, a problem is solved by calculating the previous iteration of the same problem. In practice, they are limited by the number of iterations allowed (SQL Server imposes a hard limit), by the available resources (usually memory), and probably by a few other factors. Of course, some algorithms can be implemented to overcome these limitations, while at other times there are quite simple linear implementations based on a loop, like in the examples below. Therefore, recursive functions are more of a theoretical concept.

Disclaimer: The below code was generated by Copilot via prompts and is used for demonstration purposes!

This post evaluates whether Microsoft 365 and Edge Copilot can be used to generate the code for some recursive functions. 

Case 1: Factorials

Let's start with the factorial, which calculates the product of the first n positive integers.

Prompt:
Generate a recursive function in SQL Server to calculate the factorial of a number.

Output (M365 Copilot):

-- factorial function in Microsoft 365 Copilot
CREATE FUNCTION dbo.Factorial (@n INT)
RETURNS BIGINT
AS
BEGIN
    IF @n <= 1
        RETURN 1
    ELSE
        RETURN @n * dbo.Factorial(@n - 1)
END

Output (Edge Copilot):

-- factorial function in Edge Copilot
CREATE FUNCTION dbo.Factorial (@n INT)
RETURNS BIGINT
AS
BEGIN
    IF @n = 0 OR @n = 1
        RETURN 1
    ELSE
        RETURN @n * dbo.Factorial(@n - 1)
END;

The structure seems correct, and the return type was correctly identified as bigint, which provides the largest range among SQL Server's integer types. Unfortunately, both functions lead to the same error in a SQL database:

"Msg 455, Level 16, State 2, Line 9, The last statement included within a function must be a return statement."

The formulas can be consolidated in a single statement as follows:

-- factorial function adapted (minimal)
CREATE FUNCTION dbo.Factorial (@n INT)
RETURNS BIGINT
AS
BEGIN
    RETURN 
        CASE 
            WHEN @n = 0 OR @n = 1 THEN 1
            ELSE @n * dbo.Factorial(@n - 1) 
        END
END;

Especially when formulas become complex, it might be a good idea to store the result into a variable and use it in the RETURN statement.

-- factorial function adapted
CREATE FUNCTION dbo.Factorial (@n INT)
RETURNS BIGINT
AS
BEGIN
    DECLARE @retval as bigint;

    IF @n = 0 OR @n = 1
        SET @retval = 1
    ELSE
        SET @retval = @n * dbo.Factorial(@n - 1)

    RETURN @retval
END;

And, here are some tests for the function:

-- testing the functions for multiple values
SELECT dbo.Factorial (0) -- returns 1
SELECT dbo.Factorial (1) -- returns 1
SELECT dbo.Factorial (2) -- returns 2
SELECT dbo.Factorial (3) -- returns 6
SELECT dbo.Factorial (4) -- returns 24
SELECT dbo.Factorial (5) -- returns 120
SELECT dbo.Factorial (20) -- returns 2432902008176640000
SELECT dbo.Factorial (21) -- returns an error for all bigger values

As can be seen, the function returns an error for values bigger than 20, as the output is limited by the data type used. One can add further constraints to test for the maximum value.
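The 20! cutoff follows from bigint being a signed 64-bit integer; a quick Python cross-check, outside SQL Server, confirms the boundary:

```python
import math

INT64_MAX = 2**63 - 1  # upper bound of SQL Server's bigint

# 20! is the last factorial that fits into bigint; 21! overflows it
assert math.factorial(20) == 2432902008176640000
assert math.factorial(20) <= INT64_MAX
assert math.factorial(21) > INT64_MAX
```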

Use the following code to drop the function as needed:

-- dropping the function
DROP FUNCTION IF EXISTS dbo.Factorial 

The function's implementation has, however, another important caveat. If negative numbers are provided as parameters, the following error message is thrown:

"Msg 217, Level 16, State 1, Line 1, Maximum stored procedure, function, trigger, or view nesting level exceeded (limit 32)."

The final function can be defined as:

-- factorial function adapted (final)
CREATE FUNCTION dbo.Factorial (@n INT)
RETURNS BIGINT
AS
BEGIN
    DECLARE @retval as bigint;

    IF @n BETWEEN 0 AND 20
        IF @n = 0 OR @n = 1
            SET @retval = 1
        ELSE
            SET @retval = @n * dbo.Factorial(@n - 1)

    RETURN @retval
END;

One can linearize the function as a loop, though the limitation with the maximum value still exists:

Prompt:
Rewrite the factorial function using a loop.

Output (M365 Copilot):

-- non-recursive function to calculate the factorial of a number using a loop:
CREATE FUNCTION dbo.Factorial (@n INT)
RETURNS BIGINT
AS
BEGIN
    DECLARE @result BIGINT = 1;
    DECLARE @i INT = 1;

    WHILE @i <= @n
    BEGIN
        SET @result = @result * @i;
        SET @i = @i + 1;
    END

    RETURN @result;
END

-- test the code
SELECT dbo.Factorial(5) AS FactorialResult; -- result 120

One can obtain some improvement by changing the return type and @result from bigint to float, though the results are then approximations. Here are the approximated results:

-- test code
SELECT dbo.Factorial(20) AS FactorialResult; -- result 2432902008176640000
SELECT dbo.Factorial(21) AS FactorialResult; -- result 5.109094217170944E+19
SELECT dbo.Factorial(22) AS FactorialResult; -- result 1.1240007277776077E+21
SELECT dbo.Factorial(23) AS FactorialResult; -- result 2.585201673888498E+22
SELECT dbo.Factorial(24) AS FactorialResult; -- result 6.204484017332394E+23
SELECT dbo.Factorial(25) AS FactorialResult; -- result 1.5511210043330986E+25
SELECT dbo.Factorial(26) AS FactorialResult; -- result 4.0329146112660565E+26
SELECT dbo.Factorial(27) AS FactorialResult; -- result 1.0888869450418352E+28
SELECT dbo.Factorial(28) AS FactorialResult; -- result 3.0488834461171384E+29
SELECT dbo.Factorial(29) AS FactorialResult; -- result 8.841761993739701E+30
SELECT dbo.Factorial(30) AS FactorialResult; -- result 2.6525285981219103E+32
SELECT dbo.Factorial(31) AS FactorialResult; -- result 8.222838654177922E+33
SELECT dbo.Factorial(32) AS FactorialResult; -- result 2.631308369336935E+35
SELECT dbo.Factorial(170) AS FactorialResult; -- result 7.257415615307994E+306
SELECT dbo.Factorial(171) AS FactorialResult; -- results in arithmetic overflow
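The 170/171 boundary mirrors the range of IEEE 754 double precision, which SQL Server's float type uses; a quick Python cross-check confirms it:

```python
import math

# 170! (about 7.26e306) is still representable as a double,
# while 171! (about 1.24e309) exceeds the ~1.8e308 maximum
assert float(math.factorial(170)) > 7e306

try:
    float(math.factorial(171))  # conversion to double overflows
    overflowed = False
except OverflowError:
    overflowed = True
assert overflowed
```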

Another construct that allows creating recursive queries is the common table expression (CTE).

Prompt:
Create a recursive CTE in SQL Server for calculating n factorial.

Output (M365 Copilot):

-- n factorial as CTE
WITH FactorialCTE AS (
    -- Anchor member: factorial of 0 is 1
    SELECT 0 AS n, 1 AS factorial
    UNION ALL
    -- Recursive member: factorial of n is n * factorial of (n-1)
    SELECT n + 1, (n + 1) * factorial
    FROM FactorialCTE
    WHERE n < 10 -- Change this value to calculate factorial for larger numbers
)
SELECT n, factorial
FROM FactorialCTE;

One can cast the value in the anchor member to bigint or float, as above, with all the resulting constraints:

-- n factorial as CTE
WITH FactorialCTE AS (
    -- Anchor member: factorial of 0 is 1
    SELECT 0 AS n, Cast(1 as bigint) AS factorial
    UNION ALL
    -- Recursive member: factorial of n is n * factorial of (n-1)
    SELECT n + 1, (n + 1) * factorial
    FROM FactorialCTE
    WHERE n < 20 -- Change this value to calculate factorial for larger numbers
)
SELECT n, factorial
FROM FactorialCTE;

Case 2: Approximation of π

Power series can be used to approximate the value of π and other irrational constants. Writing such a function via recursive functions has the same constraint as above, though there are alternatives (including storing the actual value as a constant). 

Prompt:
Generate a recursive function in SQL Server to approximate the value of π using power laws.

Output (M365 Copilot):

-- approximate the value of π using this series:
CREATE FUNCTION dbo.ApproximatePi (@n INT)
RETURNS FLOAT
AS
BEGIN
    IF @n = 0
        RETURN 4.0
    ELSE
        RETURN 4.0 * POWER(-1, @n) / (2 * @n + 1) + dbo.ApproximatePi(@n - 1)
END

-- test the function
SELECT dbo.ApproximatePi(1000) AS PiApproximation;

The function's definition has the same issue as the previous example; it needs to be rewritten with a single RETURN statement:

-- approximate the value of π using this series (rewritten)
CREATE FUNCTION dbo.ApproximatePi (@n INT)
RETURNS FLOAT
AS
BEGIN
    RETURN
        CASE 
            WHEN @n = 0 THEN 4.0
            ELSE
                4.0 * POWER(-1, @n) / (2 * @n + 1) + dbo.ApproximatePi(@n - 1)
        END
END

-- testing the function
SELECT dbo.ApproximatePi (0) -- output: 4
SELECT dbo.ApproximatePi (1) -- output: 2.666666666667
SELECT dbo.ApproximatePi (2) -- output: 3.466666666667
SELECT dbo.ApproximatePi (3) -- output: 2.8952380952389998
SELECT dbo.ApproximatePi (10) -- output: 3.232315809407
SELECT dbo.ApproximatePi (20) -- output: 3.189184782279
SELECT dbo.ApproximatePi (30) -- output: 3.1738423371919993
SELECT dbo.ApproximatePi (31) -- output: 3.110350273699999
SELECT dbo.ApproximatePi (32) -- output: 3.171888735237999
SELECT dbo.ApproximatePi (33) -- throws error

After 32 iterations the call throws the same error as above. Fortunately, such functions can be (re)written using non-recursive implementations, which is the proper way to implement such functions:

Prompt:
Implement the function without using recursive functions.

Output (M365 Copilot):

-- approximation within loop
CREATE FUNCTION dbo.ApproximatePi (@terms INT)
RETURNS FLOAT
AS
BEGIN
    DECLARE @pi FLOAT = 0.0;
    DECLARE @i INT = 0;

    WHILE @i < @terms
    BEGIN
        SET @pi = @pi + 4.0 * POWER(-1, @i) / (2 * @i + 1);
        SET @i = @i + 1;
    END

    RETURN @pi;
END 

This implementation is more flexible and a similar approach can be used for similar series. And, here are the results for various parameters:

-- testing the function
SELECT dbo.ApproximatePi(1000) AS PiApproximation; -- result: 3.140592653840002
SELECT dbo.ApproximatePi(10000) AS PiApproximation; -- result: 3.141492653571029
SELECT dbo.ApproximatePi(100000) AS PiApproximation; -- result: 3.141582653466945
SELECT dbo.ApproximatePi(1000000) AS PiApproximation; -- result: 3.1415916535590713
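As a sanity check on the convergence figures above, the same Leibniz-type series can be reproduced in Python; the term order matches the SQL loop, and the error shrinks roughly like 1/terms, which is why so many iterations are needed:

```python
import math

def approximate_pi(terms: int) -> float:
    """Leibniz series: pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    return sum(4.0 * (-1) ** k / (2 * k + 1) for k in range(terms))

# slow convergence: roughly one extra correct digit per tenfold terms
assert abs(approximate_pi(1_000) - math.pi) < 1e-2
assert abs(approximate_pi(100_000) - math.pi) < 1e-4
```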

The last call is close, but probably not close enough, especially when the function is needed in multiple places. There are more complex power series that converge faster. In practice, the direct use of the built-in pi() function is usually recommended:

-- using the pi function
SELECT pi(); -- 3.141592653589793
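As an illustration of a faster-converging series, the Nilakantha series, π = 3 + 4/(2·3·4) − 4/(4·5·6) + 4/(6·7·8) − …, reaches a good approximation with far fewer terms than the Leibniz series used above. Here is a sketch of a loop-based implementation in the same style (the function name is illustrative, not part of the original post):

```sql
-- approximate π via the Nilakantha series (illustrative sketch)
CREATE FUNCTION dbo.ApproximatePiNilakantha (@terms INT)
RETURNS FLOAT
AS
BEGIN
    DECLARE @pi FLOAT = 3.0;
    DECLARE @i INT = 1;

    WHILE @i <= @terms
    BEGIN
        -- each term is 4 / (2i * (2i+1) * (2i+2)), with alternating sign
        SET @pi = @pi + 4.0 * POWER(-1, @i + 1) 
            / ((2.0 * @i) * (2 * @i + 1) * (2 * @i + 2));
        SET @i = @i + 1;
    END

    RETURN @pi;
END
```

With the same number of terms, this should approximate π noticeably better than the Leibniz-based function, since the error shrinks roughly with the cube of the term count.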

Moreover, some series can be reduced to formulas based on constants like π, e or Φ.

And, a final piece of code:

-- drop the function
DROP FUNCTION IF EXISTS dbo.ApproximatePi

A recursive CTE can also be used in this case:

Prompt:
Generate a recursive function in SQL Server to approximate the value of π using power laws.

Output (M365 Copilot):

-- series via recursive CTE
WITH PiApproximationCTE AS (
    -- Anchor member: start with k = 0
    SELECT 0 AS k, CAST(4.0 AS FLOAT) AS term, CAST(4.0 AS FLOAT) AS pi_approx
    UNION ALL
    -- Recursive member: calculate the next term and add it to the approximation
    SELECT 
        k + 1,
        CAST(4.0 * POWER(-1, k + 1) / (2 * (k + 1) + 1) AS FLOAT),
        pi_approx + CAST(4.0 * POWER(-1, k + 1) / (2 * (k + 1) + 1) AS FLOAT)
    FROM PiApproximationCTE
    WHERE k < 1000 -- Change this value to increase the number of terms
)
SELECT k, pi_approx
FROM PiApproximationCTE
OPTION (MAXRECURSION 0); -- Allow for more recursion levels if needed

This time Copilot generated a different approach for calculating the same value. (The output is too long to include here, as one record is generated for each iteration.)
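If only the final approximation is needed, the same CTE pattern can be limited to its last row:

```sql
-- return only the final approximation instead of one row per iteration
WITH PiApproximationCTE AS (
    -- Anchor member: start with k = 0
    SELECT 0 AS k, CAST(4.0 AS FLOAT) AS pi_approx
    UNION ALL
    -- Recursive member: add the next term of the series
    SELECT 
        k + 1,
        pi_approx + CAST(4.0 * POWER(-1, k + 1) / (2 * (k + 1) + 1) AS FLOAT)
    FROM PiApproximationCTE
    WHERE k < 1000
)
SELECT TOP 1 k, pi_approx
FROM PiApproximationCTE
ORDER BY k DESC
OPTION (MAXRECURSION 0); -- lift the default limit of 100 recursion levels
```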

Happy coding!


20 January 2025

🏭🗒️Microsoft Fabric: [Azure] Service Principals (SPN) [Notes]

Disclaimer: This is work in progress intended to consolidate information from various sources for learning purposes. For the latest information please consult the documentation (see the links below)! 

Last updated: 20-Jan-2025

[Azure] Service Principal (SPN)  

  • {def} a non-human, application-based security identity used by applications or automation tools to access specific Azure resources [1]
    • can be assigned precise permissions, making them perfect for automated processes or background services
      • allows to minimize the risks of human error and identity-based vulnerabilities
      • supported in datasets, Gen1/Gen2 dataflows, datamarts [2]
      • authentication type 
        • supported only by [2]
          • Azure Data Lake Storage
          • Azure Data Lake Storage Gen2
          • Azure Blob Storage
          • Azure Synapse Analytics
          • Azure SQL Database
          • Dataverse
          • SharePoint Online
        • doesn’t support
          • SQL data source with Direct Query in datasets [2]
  • when registering a new application in Microsoft Entra ID, a SPN is automatically created for the app registration [4]
    • the access to resources is restricted by the roles assigned to the SPN
      • ⇒ gives control over which resources can be accessed and at which level [4]
    • {recommendation} use SPN with automated tools [4]
      • rather than allowing them to sign in with a user identity  [4]
    • {prerequisite} an active Microsoft Entra user account with sufficient permissions to 
      • register an application with the tenant [4]
      • assign to the application a role in the Azure subscription [4]
      •  requires Application.ReadWrite.All permission [4]
  • extended to support Fabric Data Warehouses [1]
    • {benefit} automation-friendly API Access
      • allows to create, update, read, and delete Warehouse items via Fabric REST APIs using service principals [1]
      • enables to automate repetitive tasks without relying on user credentials [1]
        • e.g. provisioning or managing warehouses
        • increases security by limiting human error
      • the warehouses thus created will be displayed in the Workspace list view in the Fabric UI, with the SPN shown as the Owner [1]
      • applicable to users with administrator, member, or contributor workspace role [3]
      • minimizes risk
        • warehouses created with a delegated account or fixed identity (the owner’s identity) will stop working when the owner leaves the organization [1]
          • Fabric requires the user to log in every 30 days to ensure a valid token is provided for security reasons [1]
    • {benefit} seamless integration with Client Tools: 
      • tools like SSMS can connect to the Fabric DWH using SPN [1]
      • SPN provides secure access for developers to 
        • run COPY INTO
          • with and without firewall enabled storage [1]
        • run any T-SQL query programmatically on a schedule with ADF pipelines [1]
    • {benefit} granular access control
      • Warehouses can be shared with an SPN through the Fabric portal [1]
        • once shared, administrators can use T-SQL commands to assign specific permissions to SPN [1]
          • allows to control precisely which data and operations an SPN has access to  [1]
            • GRANT SELECT ON <table name> TO <Service principal name>  
      • warehouses' ownership can be changed from an SPN to user, and vice-versa [3]
    • {benefit} improved DevOps and CI/CD Integration
      • SPN can be used to automate the deployment and management of DWH resources [1]
        •  ensures faster, more reliable deployment processes while maintaining strong security postures [1]
    • {limitation} default semantic models are not supported for SPN created warehouses [3]
      • ⇒ features such as listing tables in dataset view, creating report from the default dataset don’t work [3]
    • {limitation} SPN for SQL analytics endpoints is not currently supported
    • {limitation} SPNs are currently not supported for COPY INTO error files [3]
      • ⇐ Entra ID credentials are not supported as well [3]
    • {limitation} SPNs are not supported for GIT APIs. SPN support exists only for Deployment pipeline APIs [3]
    • monitoring tools
      • [DMV] sys.dm_exec_sessions.login_name column [3] 
      • [Query Insights] queryinsights.exec_requests_history.login_name [3]
      • Query activity
        • submitter column in Fabric query activity [3]
      • Capacity metrics app: 
        • compute usage for warehouse operations performed by SPN appears as the Client ID under the User column in Background operations drill through table [3]
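As a quick sketch of the DMV mentioned above, active sessions (including those opened by an SPN) can be inspected directly; the columns used here are standard, though how the SPN's login_name is rendered depends on the environment:

```sql
-- sketch: list active user sessions and the identity that opened them
SELECT session_id, login_name, program_name
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```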

References:
[1] Microsoft Fabric Updates Blog (2024) Service principal support for Fabric Data Warehouse [link]
[2] Microsoft Fabric Learn (2024) Service principal support in Data Factory [link]
[3] Microsoft Fabric Learn (2024) Service principal in Fabric Data Warehouse [link]
[4] Microsoft Fabric Learn (2024) Register a Microsoft Entra app and create a service principal [link]
[5] Microsoft Fabric Updates Blog (2024) Announcing Service Principal support for Fabric APIs [link]

Resources:
[R1] Microsoft Learn (2025) Fabric: What's new in Microsoft Fabric? [link]
 
Acronyms:
ADF - Azure Data Factory
API - Application Programming Interface
CI/CD - Continuous Integration/Continuous Deployment
DMV - Dynamic Management View
DWH - Data Warehouse
SPN - Service Principal
SSMS - SQL Server Management Studio

About Me

Koeln, NRW, Germany
IT Professional with more than 25 years experience in IT in the area of full life-cycle of Web/Desktop/Database Applications Development, Software Engineering, Consultancy, Data Management, Data Quality, Data Migrations, Reporting, ERP implementations & support, Team/Project/IT Management, etc.