11 October 2024

🧭Business Intelligence: Perspectives (Part XVII: Creating Value for Organizations)

Business Intelligence Series

How does one create value for an organization in the BI area? This is one of the questions BI professionals should ask themselves, and eventually their colleagues, on a periodic basis, because the mere act of providing reports and good-looking visualizations doesn’t create value per se. Therefore, it’s important to identify the critical success factors and value drivers within each area!

One can start with the data, BI or IT strategies, when organizations have invested time in their direction, respectively with the KPIs and/or OKRs defined for them, and hopefully the organizations already have something similar in place! However, these are just topics that can be used to get a bird's-eye view over the overall landscape and challenges. It’s advisable to dig deeper, especially when the strategic, tactical and operational plans aren’t in sync, and, let’s be realistic, this probably happens in many organizations, more often than one wants to admit!

Ideally, the BI professional should be able to talk with the colleagues who could benefit from having a set of reports or dashboards that offer a deeper perspective into their challenges. Talking with each of them can be time-consuming and not necessarily value-driven. However, giving each team or department the chance to speak their mind and brainstorm what can be done could in theory bring more value. Even if their issues and challenges should be reflected in the strategy, there’s always an important gap between the actual business needs and those reflected in formal documents, especially when the latter are not revised periodically. Ideally, such issues should be traced back to a business goal, though it’s questionable how much such an alignment is possible in practice. Exceptions will always exist, no matter how well structured and thought out a strategy is!

Unfortunately, this approach also involves some risks. Despite their local importance, the topics raised might not be aligned with what the organization wants, and there can be a strong case against them and even a set of negative aspects related to them. However, talking about the costs involved in losing an opportunity can hopefully change the balance favorably. In general, translating issues into the costs they generate for the organization has (hopefully) the power to change people’s minds.

Organizations tend to bring forward the major issues and address the minor ones only afterwards, with the effect that occasionally some of the small issues grow in impact when left unaddressed. It makes sense to prioritize with the risks, costs and quick wins in mind while looking at the broader perspective! Quick wins are usually addressed at the strategic level, but apparently seldom at the tactical and operational levels, and it is at these levels that one can create the most important impact, paving the way for other strategic measures and activities.

The question from the title is not limited to BI professionals - it should be on each manager’s and every employee’s mind. The user is the closest to the problems and opportunities, while the manager is the one who has a broader view and the authority to push a topic up the waiting list. Unfortunately, the waiting lists in some organizations are quite long, while not having a good set of requests on the list might indicate that issues exist in other areas!

BI professionals and organizations probably know the theory well but prove to have difficulties combining it with practice. It’s challenging to obtain the needed impact (eventually the maximum effect) with a minimum of effort while addressing the different topics. Sooner or later the complexity of the topic kicks in, messing things up!

17 September 2024

#️⃣Software Engineering: Mea Culpa (Part V: All-Knowing Developers are Back in Demand?)

Software Engineering Series

I’ve been reading many job descriptions lately related to my experience and, curiously or not, I observed that many organizations look for developers with Microsoft Dynamics experience in the CRM, respectively Finance and Operations (F&O) and Business Central (BC) areas. It’s a good sign that the adoption of Microsoft solutions for CRM and ERP is increasing, especially when one considers the progress made in the BI and AI areas with the introduction of Microsoft Fabric, which gives Microsoft a considerable boost. Conversely, it seems that the "developers are good for everything" phrase is back, at least from what one reads in job descriptions.

Of course, it’s useful to have an in-house developer who can address all the aspects of an implementation, though that’s a lot to ask considering the different non-programming areas that need to be addressed. It’s true that a developer with experience can handle Requirements, Data and Process Management, respectively Data Migrations and Business Intelligence topics, though each of these topics can easily become a full-time job before, during and after project implementations. I’ve been there and I (hopefully) know what the jobs imply. Even if an experienced programmer can easily handle the different aspects, there will also be times when all the topics combined will be too much for one person!

It's not a novelty that job descriptions are treated like Christmas lists, but it’s difficult to differentiate between the essential and nonessential skills. I read many job descriptions lately in which, among a huge list of demands, one of the requirements is to program in the F&O framework, a sign that D365 programmers are in high demand. I worked for many years as a programmer and Software Engineer, respectively in the BI area, where SQL and non-SQL code is needed. Even if I can understand the code in F&O, does it make sense to learn now to program in X++ and the whole framework?

It's never too late to learn new tricks, respectively another programming language and/or framework. It even helps to provide better solutions in other areas, though frankly I would rather invest my time elsewhere - AI-related topics like AI prompting or Data Science seem to be more interesting in the long term, especially when they are already in demand!

There seems to be a tendency for Data Science professionals to do everything, building their own solutions and ignoring the experience accumulated, respectively the data models built in the BI and Data Analytics areas, as if the topics and data models were unrelated! It’s also true that AI modeling comes with its own requirements in what concerns data modeling (e.g. translating non-numeric to numeric values), though I believe that common ground can be found!
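As a quick illustration of that data-modeling requirement, here is a minimal sketch in Python/pandas of translating a non-numeric attribute into numeric columns via one-hot encoding; the column names and values are made up for the example, not taken from an actual solution:

```python
# A minimal sketch: translating non-numeric (categorical) values into numeric
# ones before feeding them to a model; column names are illustrative assumptions.
import pandas as pd

customers = pd.DataFrame({
    "segment": ["Retail", "Wholesale", "Retail", "Online"],
    "revenue": [120.0, 540.0, 80.0, 210.0],
})

# One-hot encoding: each category becomes its own 0/1 numeric column.
encoded = pd.get_dummies(customers, columns=["segment"])
print(encoded)
```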

Similarly, notebook-based programming seems to replicate logic in each solution, which occasionally makes sense, though personally I wouldn’t recommend it as a practice! The other day, I was looking at code developed in Python to mimic the joining of tables, when a view with the same logic could be more easily (re)used, maintained and read, and would probably be more efficient, even if different engines are used. It will be interesting to see how the mix of spaghetti solutions will evolve over time. There are developers already complaining about the number of objects created by building logic for each layer of the medallion architecture! Even if it makes sense from architectural considerations, it will become a nightmare in time.
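To make the contrast concrete, below is a minimal sketch using an in-memory SQLite database and pandas; the tables and the view are hypothetical, meant only to show join logic duplicated in a notebook versus a view defined once and reused:

```python
# A minimal sketch of the contrast described above; table, column and view
# names (orders, customers, v_order_details) are illustrative assumptions.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, customer_name TEXT);
    INSERT INTO orders VALUES (1, 10, 100.0), (2, 20, 250.0);
    INSERT INTO customers VALUES (10, 'Contoso'), (20, 'Fabrikam');
""")

# Notebook-style approach: each notebook re-implements the join in pandas,
# duplicating the logic wherever it is needed.
orders = pd.read_sql("SELECT * FROM orders", conn)
customers = pd.read_sql("SELECT * FROM customers", conn)
joined_in_notebook = orders.merge(customers, on="customer_id", how="left")

# View-based approach: the join is defined once in the database and simply
# (re)used, maintained and read in one place.
conn.execute("""
    CREATE VIEW v_order_details AS
    SELECT o.order_id, o.amount, c.customer_name
    FROM orders o
    LEFT JOIN customers c ON c.customer_id = o.customer_id
""")
joined_via_view = pd.read_sql("SELECT * FROM v_order_details", conn)

print(joined_in_notebook)
print(joined_via_view)
```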

One can also wonder about the nomenclature used - Data Engineering for the simple manipulation of data between structures in data transformations, respectively Prompt Engineering for structuring the prompts for AI. I believe that engineering involves more than this, no matter the context!

Previous Post <<||>> Next Post

16 September 2024

🧭Business Intelligence: Mea Culpa (Part IV: Generalist or Specialist in an AI Era?)

Business Intelligence Series

Except for the early professional years when I did mainly programming for web or desktop applications in the context of n-tier architectures, over the past 20 years my professional life was a mix of BI, Data Analytics, Data Warehousing, Data Migrations and other topics (ERP implementations and support, Project Management, IT Service Management, IT, Data and Applications Management), though the BI topics covered on average probably at least 60% of my time, either as internal or external consultant.

I can thus consider myself a generalist who had the chance to cover most of the important aspects of a business from an IT perspective, and it was a great experience, at least until now! It’s a great opportunity to have the chance to look at problems, solutions, processes and the various challenges and opportunities from different perspectives. Technical people should have this opportunity directly in their jobs through the communication occurring in projects or IT services, though that’s more of a wish! Unfortunately, the dialogue between IT and business occurs almost only over tickets and documents, which might be transparent but isn’t necessarily effective or efficient!

Does working only part time in an area make a person less experienced or knowledgeable than others? In theory, a full-time employee should get more exposure in depth and/or breadth, but that’s relative! It depends on the challenges one faces, the variation of the tasks, the implemented solutions, their depth and other technical and nontechnical factors like training, one’s experience in working with the various tools, the variety of the tasks and problems faced, professionalism, etc. A richer exposure can, but does not necessarily, involve more technical and nontechnical knowledge, and this shouldn’t be taken as a given! There’s no right or wrong answer even if people tend to take sides and argue over details.

Independently of a job's effective time, one is forced to use one's time to keep current with technologies or extend one’s horizon. In IT, a professional can seldom rely only on what is learned on the job. Fortunately, nowadays one has more and more ways of learning, while the challenge shifts toward what to ignore, respectively toward better management of one’s time while learning. The topics increase in complexity and with this blogging becomes even more difficult, especially when one competes with AI content!

Talking about IT, it will be interesting to see how much AI can help or replace some of the professions or professionals. Anyway, some jobs will become obsolete or shift their focus to prompt engineering and technical reviews. AI still needs explicit descriptions of how to address tasks, at least until it learns to create and use better recipes for problem definition and solving. The bottom line: AI and its use can’t be ignored, and it can and should be used also for learning new things. It’s amazing what one can do nowadays with prompt engineering!

Another aspect AI can help with is tailoring the content to one’s needs. A high percentage of the learning process is spent fishing in a sea of information for content that is worth knowing, respectively for a solution to one’s needs. AI should also be able to address some of the context without the prompter being forced to provide the information explicitly!

AI opens many doors but can close many others. How much of one’s experience will remain relevant over the next years? Will AI have more success in addressing some of the challenges existing in people’s understanding, or will people just trust AI blindly? Anyway, somebody must be smarter than AI, and here people’s collective intelligence can probably prove to be a real match.

14 September 2024

🗄️Data Management: Data Governance (Part II: Heroes Die Young)

Data Management Series

In the call for action, some organizations tend to idealize and overload the main actors' purpose and image when talking about data governance, by calling them heroes. Heroes are those people who fight for a goal they believe in with all their being and occasionally pay the supreme tribute. Of course, the image of heroes is idealized and many other aspects are ignored, though such images sell ideas and ideals. Organizations might need heroes and heroic deeds to change the status quo, but the heroism doesn't necessarily pay off for the "heroes"!

Sometimes, organizations need a considerable effort to change the status quo. It can be people's resistance to the new, to the demands, to the ideas propagated, especially when they are not clearly explained and executed. It can be the incommensurable distance between the "AS IS" and the "TO BE" perspectives, especially when clear paths aren't in sight. It can be the lack of resources (e.g., time, money, people, tools), knowledge, understanding or skillset that makes the effort difficult.

Unfortunately, such initiatives favor action over adequate strategies, planning and understanding of the overall context. The call to do something creates waves of actions and reactions which, in the organizational context, can lead to storms and even extreme behavior that ranges from resistance to the new to heroic deeds. Finding a few messages that support the call for action can help, though they can't replace the various critical success factors.

Leading organizations on a new path requires a well-defined realistic strategy, respectively adequate tactical and operational planning that reflects the organization's specific needs, knowledge and capabilities. Just demanding that people do their best is not enough, and heroism has chances to appear especially in this context. Unfortunately, the whole weight falls on the shoulders of the people chosen as actors in the fight. Ideally, it should be possible to spread the whole weight over a broader basis, which should be considered the foundation for the new.

The "heroes" metaphor is idealized and the negative outcome probably exaggerated, though extreme situations do occur in organizations when decisions, planning, execution and expectations are far from ideal. Ideal situations are met only in books and seldom in practice!

The management demands and the people execute, much like in the army, though, by contrast, people here need to understand the reasoning behind what they are doing. Proper execution requires skillset, understanding, training, support, tools and the right resources for the right job. Just relying on people's professionalism and effort is not enough and is suboptimal, but this is what many organizations seem to do!

Organizations tend to respond to the various barriers or challenges with more resources or pressure instead of analyzing and depicting the situation adequately, and eventually changing the strategy, tactics or operations accordingly. It's also difficult to do this as long as an organization doesn't have the capabilities and practices of self-checking, self-introspection, self-reflection, etc. Even if it sounds a bit exaggerated, an organization must know itself to overcome the various challenges. Regular meetings, KPIs and other metrics give the illusion of control when self-control is needed.

Things don't have to be that complex, even if managing data governance is a complex endeavor. Small or midsized organizations are in theory more capable of handling complexity because they can be more agile, have a robust structure, and the flow of information and knowledge has fewer barriers, respectively a shorter distance to overcome, at least in theory. One can probably appeal to the laws and characteristics of networks to understand more about the deeper implications, and about how solutions can be implemented in more complex setups.

🗄️Data Management: Data Culture (Part V: Quid nunc? [What now?])

Data Management Series

Despite the detailed planning and the concentrated, well-directed effort with which the various aspects of data culture are addressed, things don't necessarily turn out the way we want them to. There's seldom only one cause but a mix of various factors that create a network of cause-and-effect relationships which tend to diminish or increase the effect of certain events or decisions, and it can be just a butterfly's flutter that stirs a set of chained reactions. The butterfly effect is usually an exaggeration, until the proper conditions for chaotic behavior appear!

The butterfly effect is made possible by the exponential divergence of two paths. Conversely, success probably needs multiple trajectories to converge toward a final point, or toward intermediary points or areas from which things move on the "right" path. Success doesn't necessarily mean reaching a point but reaching a favorable zone from which future behavior follows a positive trend. For example, a sink or a cone-like structure allows water to accumulate and flow toward an area. A similar structure is needed for success to converge, and that structure results from what is built in the process.

Data culture needs a similar structure for the various points of interest to converge. Things don't happen by themselves unless the force of the overall structure is so strong that it allows things to move toward the intended path(s). Even then the paths can be far from optimal, but they can be favorable. Probably, that's what the general effort must do - bring the various aspects into the zone that allows things to unfold. It might still be a long road, though the basis is there!

A consequence of this metaphor is that one must identify the important aspects, respectively factors that influence an organization's culture and drive them in the right direction(s) – the paths that converge toward the defined goal(s). (Depending on the area of focus one can consider that there are successions of more refined goals.)

The structure that allows things to converge is based on the alignment of the various paths and, implicitly, forces. Misalignment can make a force move in another direction, with all the consequences deriving from this behavior. If the force is weak, it probably won't have an impact on the overall structure, though that's relative and can change in time.

One may ask what this whole construct is needed for, even if it doesn’t reflect reality. Sometimes, even a not entirely correct model can allow us to navigate the unknown. The model's intent is to depict what's needed for an initiative to be successful. Moreover, success doesn’t mean hitting the bull's eye but being first in the zone until one's skillset enables performance.

Conversely, it's important to understand that things don't happen by themselves. At least this seems to be the feeling some initiatives leave. One needs to build and pull the whole structure in the right direction, and the alignment of the various forces can reduce the overall effort and increase the chances for success. Attempting to build something just because it’s written in documentation, without understanding the whole picture (or something close to it), can easily lead to failure.

This doesn’t mean that all attempts that don’t follow a set of patterns are doomed to failure, but that the road will be more challenging and will probably take longer. Conversely, maybe these deviations from the optimal paths are what an organization needs to grow, to solidify the foundation on which something else can be built. The whole path is an exploration that doesn’t necessarily match what is written in books, respectively the expectations!

Previous Post <<||>> Next Post

11 September 2024

🗄️Data Management: Data Culture (Part IV: Quo vadis? [Where are you going?])

Data Management Series

People working for many years in the fields of BI/Data Analytics, Data and Process Management have probably met many reactions that at first sight seem funny, though they reflect bigger issues existing in organizations: people don’t always understand the data they work with, how data are brought together as part of the processes they support, respectively how data can be used to manage and optimize the respective processes. Moreover, occasionally people torture the data until it confesses something that doesn’t necessarily reflect reality. It’s even more deplorable when the conclusions are used for decision-making, managing or optimizing the process. In extremis, the result is an iterative process that creates more and bigger issues than those it was supposed to solve!

Behind each blunder there are probably bigger understanding issues that need to be addressed. Many of the issues revolve around understanding how data are created, how they are brought together, how the processes work and what data they need, use and generate. Moreover, few business and IT people look at the full lifecycle of data and try to optimize it, or they optimize it in the wrong direction. Data Management is supposed to help, and it does this occasionally, though a methodology, its processes and practices are only as good as people’s understanding about data and its use! No matter how good a data methodology is, it’s as weak as the weakest link in its use, and typically the issues revolving around data and data understanding are the weakest link.

Besides technical people, few businesspeople understand the full extent of managing data and its lifecycle. Unfortunately, even if some of the topics are treated in books, they are too dry, need hands-on experience and some thought in corroborating practices with theories. Without this, people will do things mechanically, processes being only as good as the people using them, their value becoming suboptimal and hindering the business. That’s why training on Data Management is not enough without some hands-on experience!

The most important impact is however in the BI/Data Analytics areas - how the various artifacts are created and used as support in decision-making, process optimization and other activities rooted in data. Ideally, some KPIs and other metrics should be enough for managing and directing a business; however, just basing decisions on a set of KPIs without understanding the bigger picture, without having a feeling for the data and their quality, the whole architecture, no matter how splendid, can break down like a sandcastle on a shore meeting the first powerful wave!

Sometimes it feels like organizations do things from inertia, driven by the forces of the moment, by initiatives and business issues for which temporary and later permanent solutions are needed. The best chance for solving many of the issues would have been a long time ago, when the issues were still too small to create any powerful waves within the organizations. Therefore, a lot of effort is sometimes spent in solving the consequences of decisions not made at the right time, and that can be painful and costly!

Building a good business also needs a solid foundation. In the past it was enough to have a good set of products that are profitable. However, during the past decade(s) the rules of the game changed, driven by fierce competition across geographies, with inefficiencies, especially in the data and process areas, costing organizations in the short and long term. Data Management in general and Data Quality in particular, even if they’re challenging to quantify, have the power to address by design many of the issues existing in organizations, if given the right chance!

Previous Post <<||>> Next Post

02 September 2024

🗄️Data Management: Data Culture (Part III: A Tale of Two Cities)


One of the curious things is that, as part of their change of culture, organizations try to adopt a new language, to give new names to things, to make a distinction between the "AS IS" and "TO BE" states, insisting how the new image will replace the previous one. Occasionally, they even stress how bad things were in the past and how great they will be in the future, trying to depict the future in vivid images.

Even if this might work occasionally, it tends to confuse people, and not necessarily because of the language and the metaphors used, or the fact that the same people remain in the same positions, but because of the lack of belief or conviction, respectively the half-hearted enthusiasm personified by the parties. To "convert" people to new philosophies one needs to believe in them, or at least mimic that belief in similar terms. The lack of conviction can easily have a false effect that spreads within the organization.

Dissociation from the past, from what an organization was, tends to increase the resistance against the new because two different images are involved. On one side there’s the attachment to the past: even if mistakes were made, or things didn’t go optimally, the experiences and decisions made are part of the organization, of the people who made them. People as individuals and as an organization should embrace their mistakes and good deeds altogether, learn from them, improve what is to improve and move forward. On the other side, there’s the resistance to the new, to the change, to words they don’t believe in yet, to a bigger picture that is still fuzzy in their minds, and there can be many other reasons that don’t agree with one’s understanding.

There are images, memories, views, decisions and objectives of the past, and people need to recognize the road from what was to what should be. One can hypothesize that embracing one’s mistakes and understanding the chain of reasoning from then and from now will help an organization transition towards the new. Awareness of one’s situation will most probably help in the transition process. Unfortunately, leaders and technology gurus tend to depict the past as negative, creating thus more negative emotions, respectively reactions in the process. The past is still part of the people, of the organization, and will continue to be.

Conversely, the disassociation from the past can create more resistance to the new, and probably more unnecessary barriers. Probably, it would be easier for the gurus to build the new if the past weren’t there! Forgetting the past would be an error because there are many lessons that can still be useful. All the experience needs to be redirected in new directions. It’s more important to help people see the vision of the future, understand their missions, the paths to be followed and the challenges ahead.

It sounds more like rambling from a psychology course, though organizations do have an image they want to change, to bring forth to cope with the various challenges, an image they want to reflect when needed. There are also organizations that want to change but keep their image intact, which leads to deeper conflicts. Unfortunately, changes of image involve conflicts that can become complex through what they bring forth.

A data culture should increase people’s awareness of the present, respectively of the future, of what it takes to bridge the gap, the challenges ahead, how to embrace change, how to keep a realistic perspective, how to do a reality check, etc. Methodologies can increase people’s awareness and provide the theoretical basis, though walking the path will be a different story for everyone. 

01 September 2024

🗄️Data Management: Data Governance (Part I: No Guild of Heroes)

Data Management Series

Data governance appeared as a topic around the 1980s, though it gained popularity in the early 2000s [1]. Twenty years later, organizations still miss the mark, respectively fail to understand and implement it in a consistent manner. As usual, the reasons for failure are multiple and vary from misunderstanding what governance is all about to poor implementation of methodologies and inadequate management or leadership.

Moreover, methodologies tend to idealize the various aspects, and that is not what organizations need - they need pragmatism. For example, data governance is not about heroes and heroism [2], which can give the impression that heroic actions are involved, which is not the case! Actions for the sake of action don’t necessarily lead to change by themselves. Organizations are in general good at creating meaningless action without results, especially when people preoccupy themselves with the wrong things, miss or ignore the mark. Big organizations are very good at generating actions without effects.

People do talk to each other, though they try to solve their own problems and optimize their own areas without necessarily thinking about the bigger picture. The problem is not necessarily communication or the lack of depth into business issues; people do communicate and know the issues even without a business impact assessment. The challenge usually lies in convincing the upper management that the effort needs to be consolidated and supported, respectively that the needed resources must be made available.

Probably, one of the issues with data governance is the attempt to create another structure in the organization focused on quality, which has good chances to fail, and unfortunately does fail. Many issues appear when the structure gains weight and becomes a separate entity instead of being the backbone of the organization.

As soon as organizations separate data governance from the key users, management and the other important decision-makers in the organization, it takes on a life of its own that has good chances to diverge from the initial construct. Then, organizations need "alignment" and probably other big words to coordinate the effort. Such constructs can also work, but they are suboptimal because the forces will always pull in different directions.

Making each manager and the upper management responsible for governance is probably the way to go, though they’ll need the time for it. In theory, this can be achieved when many of the issues are solved at the lower level, when automation and further aspects allow them to supervise things, rather than hiding behind every issue. 

When too much micromanagement is involved, people tend to busy themselves with topics rather than solve the issues they are confronted with. The actual actors need to be empowered to make decisions and optimize their work when needed. Kaizen, the philosophy of continuous improvement, has proved that it works when applied correctly. They’ll need the knowledge, skills, time and support to do it, though. One of the dangers is however that this becomes a full-time responsibility, which tends to create a separate entity again.

The challenge for organizations lies probably in the friction between where they are and what they must do to move forward toward the various objectives. Moving in small rapid steps is probably the way to go, though each person must be aware when something doesn’t work as expected and react. That’s probably the most important aspect. 

So, the more functions are created that diverge from the actual organization, the higher the chances for failure. Unfortunately, failure becomes visible only in the later phases, and thus self-awareness, self-control and other similar "qualities" are needed, like small actors that keep the system in check and react whenever needed. Ideally, the employees are the best resource to react whenever something doesn’t work as per design.

Previous Post <<||>> Next Post 

Resources:
[1] Wikipedia (2023) Data Management [link]
[2] Tiankai Feng (2023) How to Turn Your Data Team Into Governance Heroes [link]


22 August 2024

🧭Business Intelligence: Perspectives (Part XV: From Data to Storytelling III)

Business Intelligence Series

As children we heard, or later read, many stories, and even if few remained imprinted in memory, we can still recognize some of the metaphors and ideas used. Stories prepared us for life, and one can suppose that the business stories we hear nowadays have a similar intent, charge and impact. However, if we dig deeper into each story and dissect it, we may be disappointed by its simplicity, its resemblance to other stories, to what we've heard over time. Moreover, stories can also carry negative connotations that can impact any other story we hear.

From the scores or hundreds of distinct stories that have been told, few reach a magnitude that can become more than the stories themselves, few become a catalyst for the auditorium, and even then they tend to manipulate. Conversely, well-written transformative stories can move mountains when they resonate with the auditorium. In a leader’s motivational speech such stories can become a catalyst that moves people in the intended direction.

Children's stories are quite simple and apparently don’t need special constructs, even if the choice of words, structure and messages is important. Moving further into organizations, storytelling becomes more complex; depending on the case, structures and messages need to follow certain conventions within some politically correct scripts. Facts become important to the degree they serve the story, though the purposes they serve change with time, becoming secondary to the story. Storytelling thus becomes just a way of changing the facts as the storyteller sees fit.

Storytelling has its role in organizations in channeling the multitude of messages across various structures. However, the more one hears the word storytelling, the more likely one is closer to fiction than to business decision-making. It's also true that the word itself carries a power we all tasted during childhood and, why not, much later. The word has a magic power that appeals to our memories, to our feelings, to our expectations. However, as soon as one's expectations are not met, the fight with the chimeras turns into a battle of our own. Yes, storytelling has great power when used right, when there's a story to tell, when the business narratives are worth telling.

The problem with stories is that no matter how much they are based on real facts or happenings, they become fictitious in time, to the degree that they lose some of the most important facts they were based on. That’s valid especially when there’s no written track of the story, though even then various versions of the story can multiply outside of the standard channels and boundaries. 

Even if the author tried to keep the story as close to the facts as possible, the way stories are understood, remembered and retold depends on too many factors - the words used, the degree to which metaphors and similar elements are understood, remembered and transmitted correctly, the language used, the mental structures existing in the auditorium, the association of words, ideas or metaphors, etc.

Unfortunately, the effect of stories can be negative too, especially when stories are designed to manipulate the auditorium beyond any ethical norms. When they don’t resonate with the crowd or are repeated unnecessarily, the narratives may have adverse effects, and the messages can get lost in the crowd or create resistance. Moreover, stories may have a manifold and opposite effect within different segments of the auditorium.

Storytelling can make hearts and minds resonate with the carried messages, though misdirected, improper or poorly conceived stories also have the power to destroy all that has been built over the years. Between the two extremes there’s only a small space in which to send the messages across!

21 August 2024

🧭Business Intelligence: Perspectives (Part XIV: From Data to Storytelling II)

Business Intelligence Series

Being snapshots of people's and organizations’ lives, data arrive to tell a story, even if the story might not be worth telling or might be important only in certain contexts. In fact, each record in a dataset has the potential of bringing a story to life, though business people are more interested in the hidden patterns and "stories" the data reveal through more or less complex techniques. Therefore, data are usually tortured until they confess something, and unfortunately people stop analyzing the data with the first confession(s).

Even if it looks like torture, data need to be processed to reveal certain characteristics, trends or patterns that could help us in sense-making, decision-making or similar specific business purposes. Unfortunately, the volume of data increases with an incredible velocity, to which further characteristics like variety, veracity, value and variability add up.

The data in a dashboard, presentation or even a report should ideally tell a story, otherwise the data might not be worth looking at, at least from some people’s perspective. Probably, that’s one of the reasons why many dashboards remain unused shortly after they were made available, even if considerable time and money were invested in them. Seeing the same dull numbers gives the illusion that nothing changed, that nothing is worth reviewing, revealing or considering, which might occasionally be true, though one can’t take this as a rule! Lots of important facts could remain hidden or not considered.

One can suppose that there are businesses in which something important seldom happens and an alert can do a better job than reviewing a dashboard or a report frequently. Probably an alert is a better choice than reporting metrics nobody looks at! 

Organizations usually define a set of KPIs (key performance indicators) and other types of metrics they (intend to) review periodically. Ideally, the numbers collected should define and reflect the critical points (aka pain points) of an organization, if they can be known in advance. Unfortunately, in dynamic businesses the focus can change considerably from one day to another. Moreover, in systemic contexts critical points can remain undiscovered in time if the set of metrics defined doesn’t consider them adequately. 

Typically, only one’s experience and current or past issues can tell what one should consider or ignore, which are the critical/pain points or important areas that must be monitored. Ideally, one should implement alerts for the critical points that require an immediate response and use KPIs for the recurring topics (though the two approaches may overlap).
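As a small illustration of the distinction, here is a minimal sketch with purely hypothetical metrics and thresholds - alerts fire as soon as a critical value is crossed, while KPIs are only summarized for periodic review:

```python
# A minimal sketch of the alert-versus-KPI distinction; metric names and
# thresholds are made up for the example.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    value: float

def check_alerts(metrics: list[Metric], thresholds: dict[str, float]) -> list[str]:
    """Critical points: fire an alert as soon as a threshold is crossed."""
    return [f"ALERT: {m.name} = {m.value} exceeds {thresholds[m.name]}"
            for m in metrics
            if m.name in thresholds and m.value > thresholds[m.name]]

def kpi_report(metrics: list[Metric]) -> str:
    """Recurring topics: a periodic KPI summary reviewed on a schedule."""
    return "\n".join(f"{m.name}: {m.value}" for m in metrics)

metrics = [Metric("overdue_invoices", 42), Metric("on_time_delivery_pct", 93.5)]
print(check_alerts(metrics, {"overdue_invoices": 30}))
print(kpi_report(metrics))
```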

Following the flow of goods, money and other resources one can look at the processes and identify the areas that must be monitored, prioritize them and identify the metrics that are worth tracking, respectively that reflect strengths, weaknesses, opportunities, threats and the risks associated with them. 

One can start with what changed by how much, what caused the change(s) and what further impact is expected directly or indirectly, by what magnitude, respectively why nothing changed in the considered time unit. Causality diagrams can help in the process even if the representations can become quite complex. 

The deeper one dives and the more questions one attempts to answer, the higher the chances to find a story. However, can we find a story that’s worth telling in any set of data? At least this is the point some proponents of storytelling try to make. Conversely, the data can be dull, especially when one doesn’t track or consider the right data. There are many aspects of a business that may look boring, and many metrics seem to track the boring but probably important aspects.

18 August 2024

🧭Business Intelligence: Mea Culpa (Part III: Problem Solving)

Business Intelligence Series

I've been working for more than 20 years in the BI and Data Analytics area, in combination with Software Engineering, ERP implementations, Project Management, IT services and several other areas, which allowed me to look at many recurring problems from different perspectives. One of the things I learnt is that problems are more complex and more dynamic than they seem, respectively that they may require tailored, dynamic solutions. Unfortunately, people usually focus on one or two immediate perspectives, ignoring the dynamics and the multilayered character of the problems!

Sometimes, a quick fix and a limited perspective are what we need to get started and fix the symptoms, and problem-solvers usually stop there. When left unsupervised, the problems tend to kick back, build up momentum and appear in more complex forms in various places. Moreover, the symptoms can remain hidden until it is too late. Added to this are the political agendas and the further limitations existing in organizations (people, money, know-how, etc.).

It seems much easier to involve external people (individual experts, consultancy companies) to solve the problem(s), though unless they get a deep understanding of the business and the issues existing in it, the chances are high that they solve the wrong problems and/or implement the wrong solutions. Therefore, it's more advisable to have internal experts, when feasible, and that's the point where business people with technical expertise and/or IT people with business expertise can help. Ideally, one should have a good mix, and so-called competency centers can do a great job in handling the challenges of organizations.

Between business and IT people there's a gap that can be wider or narrower depending on resources, know-how or the effort made by organizations to reduce it. To this adds the nature of the issues existing in organizations, which can vary considerably across departments, organizations or any other form of establishment. Conversely, the specific skillset can be transmuted where needed, which might happen naturally, though upon case considerable effort needs to be involved in the process.

Being involved in similar tasks, one may get the impression that one can do whatever the others can do. This can happen in IT as well as on the business side. There are activities that can be done by parties from the other group, though there are also many exceptions in both directions, especially when one considers that one can’t generalize the applicability and/or transmutation of skillsets.

A more concrete example is the know-how needed by a businessperson to use the BI infrastructure for answering business questions, and ideally for doing all or at least most of the activities a BI professional can do. Ideally, as part of the learning path, there should be a pursuable route between the two points. The mastery of tools helps in the process, though there are different mindsets involved.

Unfortunately, the data-related fields are full of overconfident people who get the problem-solving process wrong. Data-based problem-solving comes down to gathering the right facts and data, building the right conceptual model, identifying the right questions to ask, collecting more data, refining methods and solutions, etc. There’s always an easy wrong way to solve a problem!

The mastery of tools doesn’t imply the mastery of business domains! What people from the business side can bring is deeper insight into the business problems, though getting from there to implementing solutions can prove to be a long way, especially when problems require different approaches, different levels of approximation, etc. No tool alone can bridge such gaps yet! Frankly, this is the most difficult thing to learn, and unfortunately many data professionals seem to get this wrong!

Previous Post <<||>> Next Post

16 August 2024

🧭Business Intelligence: Perspectives (Part XIII: From Data to Storytelling I)

Business Intelligence Series

Data is an amalgam of signs, words, numbers and other visual or auditory elements used together to memorize, interpret, communicate and do whatever operation may seem appropriate with them. However, the data we use is usually part of one or multiple stories - how something came into being, what it represents, how it is used in the various mental and non-mental processes - respectively, the facts, concepts, ideas, contexts, places or other physical and nonphysical elements that are brought in connection with it.

When we are the active creators of a story, we can in theory easily look at how the story came into being, the data used and its role in the bigger picture, respectively the transformative elements considered or left out, etc. However, as soon as we deal with a set of data, facts or any other elements of a story we are not familiar with, we need to extrapolate the hypothetical elements that seem to be connected to the story. We need to make sense of these elements and consider all that seems meaningful, with what we consider or leave out shaping the story differently.

As children, and maybe even later, all of us dealt with stories in one way or another; we all got fascinated by the wisdom of metaphors and felt the energy that kept us awake, focused and even transformed by the words coming from the narrator's voice, probably without thinking too much about the whole picture, but letting the words do their magic. Growing up, the stories grew in complexity, probably became richer in meaning and contexts, as we were able to decipher the metaphors and other elements, as we included more knowledge about the world around us, about stories and storytelling.

In the professional context, storytelling became associated with our profession - data, information, knowledge and wisdom being created, assimilated and exchanged in more complex processes. From this perspective, data storytelling is about putting data into a (business) context to seed cultural ground, to promote decision-making and better understanding by building a narrative around the data, problems, challenges, opportunities and further organizational context.

Further on, from a BI perspective, all these cognitive processes impact how data, information and knowledge are created, (pre)processed, used and communicated in organizations, especially when considering data visualizations and their constituent elements (e.g. data, text, labels, metaphors, visual cues), and the narratives that seem compelling and resonate with the auditorium.

It's no wonder that data storytelling has become something not to neglect in many business contexts. Storytelling has proved that words, images and metaphors can transmit ideas and knowledge, be transformative, make people think, or even act without much thinking. Stories have the power to seed memes, ideas, or more complex constructs into our minds; they can be used (for noble purposes) or misused.

A story's author usually takes compelling images, metaphors and further elements and manipulates them to the degree they become interesting to himself/herself and to the auditorium, to the degree they are transformative and become an element of the business vocabulary, respectively culture, without the need to reiterate them when more complex concepts, ideas or metaphors need to be brought into being.

A story can be seen as a replication of the constituting elements, while storytelling is a set of functions that operate on them and change the initial structure and content into something that might look or not like the initial story. Through retelling and reprocessing in any form, the story changes independently of its initial form and content. Sometimes, the auditorium makes connections not recognized or intended by the storyteller. Other times, the use and manipulation of language makes the story change as seems fit. 

Previous Post <<||>>  Next Post

07 August 2024

🧭Business Intelligence: Perspectives (Part XII: From Data to Data Models)

Business Intelligence Series

A data model can be defined as an abstract, self-contained, logical definition of the data structures available in a database or similar repositories. It’s typically an abstraction of the data structures underpinning a set of processes, procedures and business logic used for a predefined purpose. A data model can be formed also of unrelated micromodels, depicting thus various aspects of a business. 

The association between data and data models is bidirectional. Given a set of data, a data model can be built to underpin the respective data. Conversely, one can create or generate data based on a data model. However, in business setups a bidirectional relationship between data and the data model(s) underpinning them is more realistic as the business evolves. In extremis, the data model can be used to reflect a business’ needs, at least when the respective needs are addressed accordingly by extending the data model(s).

Given a set of data (e.g. the data stored in one or more spreadsheets or other types of files), in theory multiple data models can be defined to reflect the respective data. Within a data model, the fields (aka attributes) are partitioned into a set of data entities, where a data entity is a nonunique grouping of attributes that attempt to define together one unitary aspect of the world. Customers, Vendors, Products, Invoices or Sales Orders are examples of such data entities, though entities can have a broader granularity (e.g. Customers can be modeled over several tables like Entity, Addresses, Contact information, etc.).

From an operational database’s perspective, a data entity is based on one or more tables, though several entities can share some of the tables. From a BI artifact’s perspective, an entity should be easy to create from the underlying tables, with a minimal set of transformations. Ideally, the BI data model should be as close as possible to the needed entity for reporting, however an optimal solution lies usually somewhere in between. In this resides the complexity of modeling BI solutions – providing an optimal data model which can be easily built on the source tables, and which allows addressing all or at least most of the BI requirements.
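As a minimal sketch of such an entity (with made-up table and column names, not an actual implementation), the Customers example above could be flattened into a reporting view built over the underlying tables with a thin layer of transformations:

```python
# A minimal sketch of a reporting entity built as a view over normalized base
# tables; all table, column and view names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE CustEntity (CustomerId INTEGER PRIMARY KEY, CustomerName TEXT);
    CREATE TABLE CustAddress (CustomerId INTEGER, City TEXT, Country TEXT, IsPrimary INTEGER);
    INSERT INTO CustEntity VALUES (1, 'Contoso'), (2, 'Fabrikam');
    INSERT INTO CustAddress VALUES (1, 'Cologne', 'DE', 1), (2, 'Vienna', 'AT', 1);

    -- The Customer entity for reporting: a thin layer of transformations over
    -- the underlying tables, reusable by several BI artifacts.
    CREATE VIEW v_Customers AS
    SELECT e.CustomerId, e.CustomerName, a.City, a.Country
    FROM CustEntity e
    LEFT JOIN CustAddress a ON a.CustomerId = e.CustomerId AND a.IsPrimary = 1;
""")

for row in conn.execute("SELECT * FROM v_Customers"):
    print(row)
```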

In other words, we deal with two optimization problems of two distinct data models. On one side the business data model must be flexible enough to provide fast read/write operations while keeping the referential data’s granularity efficient. Conversely, a BI data model needs to abstract these entities and provide a fast way of processing the data, while making data reads extremely efficient. These perspectives must apply when we move to Microsoft Fabric too. 

The operational data layer must provide this abstraction, and in this resides the complexity of building optimal BI solutions. This is the layer at which the modeling problems need to be tackled. The challenge of BI and Analytics resides in finding an optimal data model that allows us to address most or ideally all the BI requirements. Several overlapping layers of abstraction may be built in the process.

Looking at the data modeling techniques used in notebooks and other similar solutions, data modeling has the chance of becoming a redundant practice prone to errors. Moreover, data models have a tendency to be multilayered and based on certain perspectives into the processes they model. Providing reliable, flexible models involves finding the right view into the data for modeling aspects of the business. Database views allow us to easily model such perspectives, often in a unique way. Moving away from them just shifts the burden onto the multiple solutions built around the base data, which can create other important challenges.

Previous Post <<||>>  Next Post

06 August 2024

🧭Business Intelligence: Perspectives (Part XVI: On the Cusps of Complexity)

Business Intelligence Series

We live in a complex world, which makes it difficult to model and work with the complex models that attempt to represent it. Thus, we try to simplify it to the degree that it becomes processable and understandable for us, while further simplification is needed when we try to depict it by digital means that make it processable by machines, respectively by us. Whenever we simplify something, we lose some aspects, which might be acceptable in many cases, but create issues in a broader number of ways.

Each layer of simplification results in a model that addresses some parts of the whole while ignoring others, which can restrict the model's usability to the degree that it becomes unusable. The more one moves toward the extremes of oversimplification or complexification, the higher the chances for models to become unusable.

This aspect is relevant also in what concerns the business processes we deal with. Many processes are oversimplified to the degree that we track only the entry and exit points, respectively the quantitative aspects we are interested in. In theory this information should be enough when answering some business questions, though it might be insufficient when one dives deeper into the processes. One can try to approximate, however there are high chances that such approximations deviate too much from the values approximated, which can lead to strange outcomes.

Therefore, when a date or other values are important, organizations consider adding more fields to reflect the implemented process with higher accuracy. Unfortunately, unless we save a history of all the important changes in the data, it becomes challenging to derive the snapshots we need for our analyses. Moreover, it is even more challenging to obtain consistent snapshots. There are systems which attempt to obtain such snapshots through the implementation of the processes, though this approach also involves some complexity and other challenges.
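As a minimal sketch of what deriving such a snapshot could look like (assuming a hypothetical field-level change history), the state of a record "as of" a given date is the last change made on or before that date:

```python
# A minimal sketch of deriving an "as of" snapshot from a change history,
# assuming a hypothetical table of field-level changes.
import pandas as pd

history = pd.DataFrame([
    {"order_id": 1, "field": "status",   "value": "Open",      "changed_at": "2024-01-05"},
    {"order_id": 1, "field": "quantity", "value": "10",        "changed_at": "2024-01-05"},
    {"order_id": 1, "field": "quantity", "value": "8",         "changed_at": "2024-02-10"},
    {"order_id": 1, "field": "status",   "value": "Delivered", "changed_at": "2024-03-01"},
])
history["changed_at"] = pd.to_datetime(history["changed_at"])

def snapshot_as_of(history: pd.DataFrame, as_of: str) -> pd.DataFrame:
    """Keep, per entity and field, the last change made on or before the date."""
    h = history[history["changed_at"] <= pd.Timestamp(as_of)]
    h = h.sort_values("changed_at").groupby(["order_id", "field"]).tail(1)
    return h.pivot(index="order_id", columns="field", values="value")

print(snapshot_as_of(history, "2024-02-15"))
# Expected: status = Open, quantity = 8 as of mid-February
```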

Looking at the way business processes are implemented (see ERP, CRM and other similar systems), the systems track the creation, modification and a few other dates, which allow only limited perspectives. These fields typically provide the perspectives needed for standard data analysis. For many processes, it would be interesting to track other events and maybe the other values taken in between.

There is theoretical potential in tracking more detailed data, but also a complexity that’s difficult to transpose into useful information about the processes themselves. Despite tracking more data and the effort involved in such activities, processes can still behave like black boxes, especially when we have no or minimal information about the processes implemented in Information Systems.

There’s another important aspect - even if systems provide similar implementations of similar processes, the behavior of users can make an important difference. The best example is the behavior of people entering the relevant data only when a process closes and ignoring the steps happening in between (dates, price or quantity changes).

There is a lot of missing data/information not tracked by such a system, especially in what concerns users’ behavior. It’s true that such behavior can be tracked to some degree, though that happens only when data are modified physically. One can suppose that there are many activities happening outside of the system.

The data gathered represents only the projection of certain events, which might not represent accurately and completely the processes or users’ behavior. We have the illusion of transparency, though we work with black boxes. There can be a lot of effort happening outside of these borders.  

Fortunately, we can handle oversimplified processes and data maintenance, though one can but wonder how many important things can be found beyond the oversimplifications we work with, respectively what we miss in the process. 

Previous Post <<||>>  Next Post

04 August 2024

📊Graphical Representation: Graphics We Live By (Part X: Pie and Donut Charts in Power BI and Excel)

Graphical Representation Series

Pie charts are loved and hated by many alike, and there are many legitimate reasons to use them or avoid them, though the most important criterion for evaluating them is whether they do the intended job in an acceptable manner, especially when compared to other representational means. The most important aspect they depict is the part-to-whole ratio, which, even if it can be depicted by other graphical tools, few tools are efficient at representing.

The pie chart works well as a visualization tool when it has only 3-5 values that are easily recognizable in the visualization; however, as soon as the size or the number of pieces varies considerably, it becomes more difficult to visualize and interpret them, in which case their representation has more negative than positive effects. There are many topics that form something like a long tail - the portion of the distribution having many occurrences far from the head or beginning. Displaying the items from the long tail together with the other components can totally obscure the distribution of the items from the long tail, as they become unrecognizable in the diagram.

One approach to handle this is to group all the items from the long tail together under one piece (e.g. Other) and use a second form of representation to display them separately. For example, Microsoft Excel offers a way to zoom in on the section of a pie chart with small percentages by displaying them in a second pie chart (pie of pie) or bar chart (bar of pie), something like a "zoom in" perspective (see image below). Unfortunately, the feature seems to limit itself only to small percentages, and thus can't currently be used to offer a broader perspective. Ideally, it would be useful to zoom in on any piece of the pie, especially when the items are categorized as a hierarchy with two or even more levels.
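As a small sketch of the grouping behind such a "pie of pie" (the data and the 5% threshold are made up for the example), the long tail is collapsed into an Other slice, which a secondary chart can then break down:

```python
# A minimal sketch: categories below a chosen threshold are collapsed into an
# "Other" slice that a secondary chart can break down; data are assumptions.
import pandas as pd

shares = pd.Series(
    {"Coal": 0.35, "Gas": 0.30, "Nuclear": 0.20, "Wind": 0.06,
     "Solar": 0.04, "Hydro": 0.03, "Biomass": 0.02},
    name="share",
)

threshold = 0.05
main = shares[shares >= threshold].copy()
long_tail = shares[shares < threshold]
main["Other"] = long_tail.sum()          # main pie: big slices plus "Other"
breakdown = long_tail / long_tail.sum()  # secondary pie: the long tail rescaled

print(main)
print(breakdown)
```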


Unfortunately, even modern visualization tools offer limited features for displaying this kind of perspective in a flexible, unitary visualization, and thus users are forced to use their creativity to provide proper solutions. In the example below, the "Renewables" piece of the pie is further broken down into the components of a full pie, an ensemble supposed to function as a single form of representation. With a bit of effort, the reader will probably understand the meaning behind the two pie charts, however the encoding of colors and other elements used is suboptimal for the decoding process.

Pie Charts - Original Solution

In the above example, the arrow may suggest that a relationship exists between the two donut charts, reflected also in the description provided; however, readers may still have difficulties in correctly interpreting the diagrams, especially when there's some kind of overlapping or other type of implied or unimplied resemblance. If the colors overlap or have other similarities, are they intentional? If the circles have the same size, does this observed resemblance have a meaning? The reader shouldn't have to bother with this type of question, but should see the resemblance and the meaning of the various elements with a minimum of effort while decoding a chart's elements. Of course, when the meaning is not clear, some guidance should ideally be provided!

Unfortunately, Power BI doesn't seem to have a visual similar to the one from Excel yet; however, with a bit of effort one can obtain similar results, even if there are other minor or important limitations. For example, the lines between the two pie charts can't be drawn, so one is forced to use other encodings to show that there's a connection between the Renewables slice and the small pie chart. Moreover, the ensemble thus created isn't treated as a unit and handled accordingly. Frankly, the maturity of a graphical representation environment can and should be judged also from this perspective!

The representation below, built in Power BI, uses a few tricks to display two pie charts together. The smaller pie chart represents the breakdown, and the pieces' colors are variations of the parent's color, attempting to show that there's a relationship between the slice from the first chart and the pie chart with the details. Unfortunately, it wasn't possible to use lines like in Excel to show the relation between the two sections.

Pie of Pie in Power BI

Instead of a pie chart, one can use a donut, like in the original representation. Even if the donut uses a smaller area for representation, the pie chart offers, at least in theory, a better basis for comparisons. Stacked column charts can be used as well (see C), however one loses the certainty that the pieces must add up to 100%. Further limitations can appear when one wants to achieve more with the visualizations.

Custom charts can be used as well. The pie chart coming from xViz (see D) allows increasing the size of a pie piece by using another radius, a technique which could be used to highlight the piece represented in the second chart. Frankly, sunburst diagrams (see E) are better at representing the parent-to-child proportions, where the same color encoding has been used. Unfortunately, the more information is shown, the more loaded the visualization seems to be.

Pie of Pie Alternatives in Power BI I

A treemap can prove to be a better representation alternative because it encodes proportions in a unitary way, much like pie charts do, though it takes more space if one wants to make the labels visible. Radial charts (see G) and Aster plots (see I) can occasionally be better choices, especially because they use less space, as they display only the main categories. A second chart can be used to display the subcategories, much like in A and B. Sankey charts (see H) can be used as well, even if they don't allow representing any quantitative values unless one encodes them directly in the labels.

Pie of Pie Alternatives in Power BI II

When one dives into the world of diagrams and goes beyond the still limited representational choices provided by the standard tools, one can be surprised by the additional representational choices. However, their appropriateness should be considered against the readers' skillset to read and interpret them! Frankly, the alternatives considered above could be a better choice once they reach representational maturity.

Many thanks to Christopher Chin, who, in his weekly post on data visualization blunders, suggested the examples used as the basis for this post (see [1])!

Previous Post <<||>> Next Post

References:
[1] LinkedIn (2024) Christopher Chin's post (link)
