
25 September 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 101: Why AI Is Not an Investment Bubble)

Prompt Engineering Series


Prompt: "write a post of 600 words on why AI is not an investment bubble"

Introduction

Every few decades, a technology emerges that reshapes industries, redefines productivity, and rewires the way we live. Artificial Intelligence (AI) is one of those technologies. While skeptics warn of an investment bubble - comparing AI to the dot-com crash or crypto hype - the reality is far more grounded. AI is not a speculative fad; it’s a foundational shift in how businesses operate, how people interact with machines, and how knowledge is created.

Here’s why AI is not an investment bubble, but a long-term transformation.

1. Real Revenue, Real Products

Unlike past bubbles where companies were valued on vague promises, today’s AI leaders are generating substantial revenue. Microsoft, Google, NVIDIA, and OpenAI are not just building demos - they’re delivering enterprise-grade tools, cloud services, and developer platforms that are already embedded in workflows across industries.

From customer service automation to predictive analytics in supply chains, AI is solving real problems. Companies aren’t investing in hype - they’re investing in efficiency, insight, and competitive advantage.

2. Ubiquity Across Sectors

AI isn’t confined to one niche. It’s being adopted in healthcare (diagnostics, drug discovery), finance (fraud detection, algorithmic trading), manufacturing (predictive maintenance, robotics), and education (personalized learning). This cross-sector penetration is a hallmark of durable innovation.

When a technology becomes infrastructure - like electricity or the internet - it’s no longer a bubble. AI is heading in that direction, becoming a layer that powers everything from mobile apps to industrial systems.

3. Tangible Productivity Gains

AI is not just about automation - it’s about augmentation. Tools like Copilot, ChatGPT, and GitHub Copilot are helping professionals write code faster, draft documents, analyze data, and make decisions with greater precision. These aren’t theoretical benefits; they’re measurable improvements in productivity.

McKinsey estimates that generative AI could add trillions of dollars in value annually across the global economy. That’s not bubble talk - that’s economic transformation.

4. Infrastructure Is Catching Up

One reason past tech bubbles burst was the lack of supporting infrastructure. In the early 2000s, broadband wasn’t ready for streaming. Crypto lacked regulatory clarity and real-world use cases. AI, however, is supported by robust cloud platforms, powerful GPUs, and scalable data pipelines.

The ecosystem is mature enough to support widespread deployment. And with continued investment in chips, data centers, and edge computing, AI’s backbone is only getting stronger.

5. Regulatory Momentum, Not Paralysis

Governments are actively engaging with AI - not to shut it down, but to guide its development. The EU AI Act, U.S. executive orders, and global frameworks are creating guardrails for ethical use, transparency, and safety. This regulatory clarity encourages responsible innovation and long-term investment.

Rather than stifling growth, regulation is helping legitimize AI as a serious, sustainable field.

6. Investor Sophistication

Today’s investors are more data-driven and cautious than during previous bubbles. Venture capital firms are scrutinizing technical feasibility, market fit, and monetization strategies. Public markets are rewarding companies with real AI capabilities - not just those slapping 'AI' onto their pitch decks.

The capital flowing into AI is being deployed strategically, not blindly. That’s a sign of maturity, not mania.

Conclusion: A New Industrial Era

AI is not a bubble - it’s the beginning of a new industrial era. Like electricity, the internet, and mobile computing before it, AI is becoming a general-purpose technology. It’s not immune to hype or missteps, but its core value is undeniable.

Investors, businesses, and governments are betting on AI not because it’s trendy, but because it’s transformative. And while some ventures may fail, the overall trajectory is clear: AI is here to stay, and its impact will be profound.

Disclaimer: The whole text was generated by Copilot (under Windows 11) at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, independently of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.


19 March 2024

𖣯Strategic Management: Inflection Points and the Data Mesh (Quote of the Day)

Strategic Management Series

"Data mesh is what comes after an inflection point, shifting our approach, attitude, and technology toward data. Mathematically, an inflection point is a magic moment at which a curve stops bending one way and starts curving in the other direction. It’s a point that the old picture dissolves, giving way to a new one. [...] The impacts affect business agility, the ability to get value from data, and resilience to change. In the center is the inflection point, where we have a choice to make: to continue with our existing approach and, at best, reach a plateau of impact or take the data mesh approach with the promise of reaching new heights." [1]

I tried to understand the "metaphor" behind the quote. As the author pinpoints through another quote, the metaphor is borrowed from Andrew Grove:

"An inflection point occurs where the old strategic picture dissolves and gives way to the new, allowing the business to ascend to new heights. However, if you don’t navigate your way through an inflection point, you go through a peak and after the peak the business declines. [...] Put another way, a strategic inflection point is when the balance of forces shifts from the old structure, from the old ways of doing business and the old ways of competing, to the new." [2]

The second part of the quote clarifies the role of the inflection point - the shift from one structure, respectively organization or system, to a new one. The inflection point is not the moment we take a decision, but the moment when the decision we took and its impact shift the balance. If the data mesh comes after the inflection point (see A), then there must be some kind of causality that converges uniquely toward the data mesh, which is questionable, if not illogical. A data mesh eventually makes sense after an organization has reached a certain scale and is thus improbable to be adopted by small to medium businesses. Even for large organizations the data mesh may not be a viable solution as long as it doesn't have a proven record of success.

I could understand if the author had said that the data mesh will lead to an inflection point after its adoption, as is the case with transformative/disruptive technologies. Unfortunately, the track record of BI and Data Analytics projects doesn't give much hope for such a magical moment to happen. Probably, becoming a data-driven organization could have such an effect, though for many organizations the effects are still far from expectations.

There's another point to consider. A curve with inflection points can contain upward and downward concavities (see B), or there can be multiple curves passing through an inflection point (see C), and the continuation can be on any of the curves.
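
As a reminder, the standard calculus definition (not specific to [1] or [2]) says that a point is an inflection point of a twice-differentiable function when the second derivative vanishes and changes sign there:

\[
f''(x_0) = 0 \quad\text{and}\quad f''(x_0 - \varepsilon)\, f''(x_0 + \varepsilon) < 0 \quad\text{for all sufficiently small } \varepsilon > 0 .
\]

For example, f(x) = sin x has an inflection point at every multiple of π, so a single curve can alternate indefinitely between upward and downward concavities.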

Examples of Inflection Points [3]
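
In the spirit of [3], which draws such curves in R, here is a minimal Python analogue (the function and grid are chosen only for illustration) that locates an inflection point numerically as a sign change of the second derivative:

import numpy as np

# f(x) = x**3 - 3*x has a single inflection point, at x = 0,
# where f''(x) = 6*x changes sign from negative to positive.
x = np.linspace(-2.0, 2.0, 1000)
y = x**3 - 3 * x

# Approximate the second derivative on the grid.
d2y = np.gradient(np.gradient(y, x), x)

# An inflection point sits where the second derivative changes sign.
idx = np.where(np.diff(np.sign(d2y)) != 0)[0]
print(x[idx])  # one crossing, close to x = 0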

The change can be fast or slow (see D), and in the latter case it may take a long time for the change to be perceived. [2] also notes that the perception that something has changed can arrive in stages. Moreover, the inflection point can be only local and need not describe the future evolution of the curve, which is to say that the curve can change its trajectory shortly afterward. It happens in business processes and policy implementations that, after a change was made in extremis to alleviate an issue, a slight improvement is recognized, after which the performance decays sharply. That's the case in situations in which the symptoms, and not the root causes, were addressed.

More appropriate for describing such a change would be a tipping point, which can be defined as a critical threshold beyond which a system (here, the organization) reorganizes or changes, often abruptly and/or irreversibly.


References:
[1] Zhamak Dehghani (2021) "Data Mesh: Delivering Data-Driven Value at Scale" (book review)
[2] Andrew S. Grove (1996) "Only the Paranoid Survive: How to Exploit the Crisis Points that Challenge Every Company and Career"
[3] SQL Troubles (2024) "R Language: Drawing Function Plots (Part II - Basic Curves & Inflection Points)" (link)

17 March 2024

🧭Business Intelligence: Data Products (Part I: A Lego Exercise)

Business Intelligence Series

One can define a data product as the smallest unit of data-driven architecture that can be independently deployed and managed (aka product quantum) [1]. In other terms, one can think of a data product as a box (or Lego piece) that takes data as input, performs several transformations on it, and produces output data (or even data visualizations, or a hybrid of data, visualizations and other content).

At a high level, each Data Analytics solution can be regarded as a set of inputs, a set of outputs, and the transformations that must be performed on the inputs to generate the outputs. The inputs are the data from the operational systems, while the outputs are analytics data that can be anything from raw data to KPIs and other metrics. A data mart, data warehouse, lakehouse or data mesh can be abstracted in this way, though at different scales.

For creating data products within a data mesh, given a set of inputs, outputs and transformations, the challenge is to find horizontal and vertical partitions within these areas to create something that looks like a Lego structure, in which each Lego piece represents a data product, while its color represents membership in a business domain. Each such piece is self-contained and encapsulates a set of transformations, respectively intermediary inputs and outputs. Multiple such pieces can be combined in a linear or hierarchical fashion to transform the initial inputs into the final outputs (see the sketch below).

Data Products with a Data Mesh
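
To make the Lego metaphor concrete, here is a minimal Python sketch (all names and structures are hypothetical, not an API from [1]): each piece declares its inputs, outputs and business domain, stays self-contained, and pieces can be chained as long as outputs match inputs.

from dataclasses import dataclass
from typing import Callable, Dict

Data = Dict[str, list]  # simplistic stand-in for tables/datasets

@dataclass
class DataProduct:
    name: str        # unique identifier of the Lego piece
    domain: str      # business domain (the piece's "color")
    inputs: list     # names of the datasets the piece consumes
    outputs: list    # names of the datasets the piece produces
    transform: Callable[[Data], Data]

    def run(self, available: Data) -> Data:
        # A piece is self-contained: it sees only its declared inputs.
        subset = {k: available[k] for k in self.inputs}
        return self.transform(subset)

# Two pieces chained linearly: orders -> cleaned orders -> revenue KPI.
clean = DataProduct(
    "clean_orders", "sales", ["orders"], ["orders_clean"],
    lambda d: {"orders_clean": [o for o in d["orders"] if o["amount"] > 0]},
)
kpi = DataProduct(
    "revenue_kpi", "finance", ["orders_clean"], ["revenue"],
    lambda d: {"revenue": [sum(o["amount"] for o in d["orders_clean"])]},
)

store: Data = {"orders": [{"amount": 10}, {"amount": -1}, {"amount": 5}]}
for piece in (clean, kpi):  # linear composition
    store.update(piece.run(store))
print(store["revenue"])  # [15]

Even in this toy form, the coordination problem discussed next is visible: the two pieces only compose because someone agreed on the shared name "orders_clean".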

Finding such a partition is possible, though it involves considerable effort, especially in designing the whole thing - identifying each Lego piece uniquely. When each department is on its own and develops its own Lego pieces, there's no guarantee that the pieces from the various domains will fit together to build something cohesive, performant, secure or well-structured. It's like building a house from modules - the pieces must fit together. That would be the role of governance (federated computational governance): to align and coordinate the effort.

Conversely, there are transformations that need to be replicated to obtain autonomous data products, and the volume of such overlap can be considerably high. Consider, for example, the logic available in reports and how often it needs to be replicated. Alternatively, one can create intermediary data products, where that's feasible.

It's challenging to define the inputs and outputs for a single Lego piece. Now imagine doing the same for a whole set of such pieces depending on each other! This might work for small pieces of data and entities quite stable over their lifetime (e.g. playlists, artists, songs), but with complex information systems the effort can increase by a few factors. Moreover, the complexity of the structure increases as soon as the Lego pieces expand beyond their initial design. It's as if the real Lego pieces would grow within the available space but still keep the initial structure - strange constructs may result, which, even if they work, shift the center of gravity of the edifice in other directions. There will thus be limits to growth that can easily lead to duplication of functionality to overcome such challenges.

Each new output or change in the initial input for these magic boxes involves a change to all the intermediary Lego pieces from input to output. Just recall the last experience of defining the inputs and the outputs for an important complex report - how many iterations and how much effort were involved. That might have been an extreme case, though how realistic is the assumption that with data products everything will go smoother? No matter the effort involved in design, there will always be changes and further iterations involved.


References:
[1] Zhamak Dehghani (2021) "Data Mesh: Delivering Data-Driven Value at Scale" (book review)

