Showing posts with label risk management.

09 April 2024

🧭Business Intelligence: Why Data Projects Fail to Deliver Real-Life Impact (Part IV: Making It in the Statistics)

Business Intelligence
Business Intelligence Series

Various sources (e.g., [1], [2], [3]) put the failure rate for data projects somewhere between 70% and 85%, somewhat higher than the 60-75% estimated for standard projects, though not by much. This means that only 2-3 out of 10 projects will succeed, which is another reason to plan for failure and even to embrace it.

Unfortunately, the statistics advanced on project failure have no solid foundation and should be regarded with circumspection as long as the methodology and the information about the population used for the estimates aren't shared. Still, they do reflect an important point – many data projects do fail! It would be foolish to think that your project will not fail just because you're a big company, have the best resources, have a proven rate of success and took all the precautions for the project not to fail.

Usually, at the end of a project the team meets to document the lessons learned in the hope that the next projects will benefit from them. The team did learn something, though, as practice shows, even if the team manages to avoid some known issues, other issues will impact the next similar project, leading to similar variances. One can summarize this as "on average, the impact of new issues and of avoided known issues tends to zero out", or "on average, the pluses and minuses balance each other across projects". It's probably a question of focus – if organizations focus too much on certain aspects, other aspects are ignored and/or remain unseen.

So, your first data project will more likely than not fail. The question is: what do you do about it? It's important to be aware of why projects in general, and data projects in particular, fail, though trying to consider and monitor every possible issue can prove ineffective. One can, however, create a risk register from such a list, estimate the probability and impact of each potential failure, and focus only on the top 3-5 risks with the highest exposure. Of course, one should reevaluate the estimates on a regular basis, though that's Risk Management 101.
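
As an illustration, here is a minimal sketch of such a risk register in Python. The risks, probabilities and impact values are hypothetical placeholders rather than estimates from any real project; the point is only to show how one can score each risk as probability times impact and keep the few entries with the highest exposure for active monitoring.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str           # short description of the potential failure
    probability: float  # estimated likelihood of occurrence (0.0-1.0)
    impact: int         # estimated impact if it materializes (1 = low, 5 = high)

    @property
    def exposure(self) -> float:
        # Simple exposure score: probability multiplied by impact
        return self.probability * self.impact

# Hypothetical register entries - replace with the issues relevant to your own project
register = [
    Risk("Unclear requirements", 0.6, 5),
    Risk("Poor data quality", 0.5, 4),
    Risk("Missing key skills in the team", 0.3, 4),
    Risk("Scope creep", 0.4, 3),
    Risk("Tool or licensing limitations", 0.2, 2),
]

# Focus on the top 3-5 risks by exposure and reevaluate the estimates regularly
top_risks = sorted(register, key=lambda r: r.exposure, reverse=True)[:5]
for risk in top_risks:
    print(f"{risk.name}: exposure {risk.exposure:.1f}")

One could refine the scoring (e.g., separate scales for cost, schedule and quality impact), but even such a crude list forces the estimates to be written down and revisited.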

Besides this, one should focus on how the team can make the project succeed. When adopting a technology, methodology or set of processes, it's recommended to start with a proof-of-concept (PoC). To make the PoC a helpful experience, it's important to start with a topic that's not too big to handle, but that still involves enough complexity to allow the organization to evaluate the targeted set of tools and technologies. It can also be a topic on which other organizations have already made important progress or even succeeded. The temptation is great to tackle the most stringent issues in the organization, or to build something big that can have an enormous impact on it. Jumping too soon into such topics only increases the chances of failure.

One can also formulate the goals, objectives and further requirements in a form that allows the organization to build upon them even if the project fails. A PoC is about learning, building a foundation, doing the groundwork, exploring, mapping the unknown and identifying what's still missing to close the full circle. A PoC is less about overachievement and big impact; these can happen, though they are a consequence of the good work done in the PoC.

The bottom line: no matter whether you succeed or fail, once you start a project, you'll still make it into the statistics! More important is what you've learned from the first data project and how you can use that knowledge in further projects to make a difference!


References:
[1] Harvard Business Review (2023) Keep Your AI Projects on Track, by Iavor Bojinov (link)
[2] Cognilytica (2023) The Shocking Truth: 70-80% of AI Projects Fail! (link)
[3] VentureBeat (2019) Why do 87% of data science projects never make it into production? (link)

26 November 2018

🔭Data Science: Risk (Just the Quotes)

"A deterministic system is one in which the parts interact in a perfectly predictable way. There is never any room for doubt: given a last state of the system and the programme of information by defining its dynamic network, it is always possible to predict, without any risk of error, its succeeding state. A probabilistic system, on the other hand, is one about which no precisely detailed prediction can be given. The system may be studied intently, and it may become more and more possible to say what it is likely to do in any given circumstances. But the system simply is not predetermined, and a prediction affecting it can never escape from the logical limitations of the probabilities in which terms alone its behaviour can be described." (Stafford Beer, "Cybernetics and Management", 1959)

"It is easy to obtain confirmations, or verifications, for nearly every theory - if we look for confirmations. Confirmations should count only if they are the result of risky predictions. […] A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice. Every genuine test of a theory is an attempt to falsify it, or refute it." (Karl R Popper, "Conjectures and Refutations: The Growth of Scientific Knowledge", 1963)

"Statistical hypothesis testing is commonly used inappropriately to analyze data, determine causality, and make decisions about significance in ecological risk assessment,[...] It discourages good toxicity testing and field studies, it provides less protection to ecosystems or their components that are difficult to sample or replicate, and it provides less protection when more treatments or responses are used. It provides a poor basis for decision-making because it does not generate a conclusion of no effect, it does not indicate the nature or magnitude of effects, it does address effects at untested exposure levels, and it confounds effects and uncertainty[...]. Risk assessors should focus on analyzing the relationship between exposure and effects[...]."  (Glenn W Suter, "Abuse of hypothesis testing statistics in ecological risk assessment", Human and Ecological Risk Assessment 2, 1996)

"Until we can distinguish between an event that is truly random and an event that is the result of cause and effect, we will never know whether what we see is what we'll get, nor how we got what we got. When we take a risk, we are betting on an outcome that will result from a decision we have made, though we do not know for certain what the outcome will be. The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control over the outcome and the linkage between effect and cause is hidden from us." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Overcoming innumeracy is like completing a three-step program to statistical literacy. The first step is to defeat the illusion of certainty. The second step is to learn about the actual risks of relevant events and actions. The third step is to communicate the risks in an understandable way and to draw inferences without falling prey to clouded thinking. The general point is this: Innumeracy does not simply reside in our minds but in the representations of risk that we choose." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"The goal of random sampling is to produce a sample that is likely to be representative of the population. Although random sampling does not guarantee that the sample will be representative, it does allow us to assess the risk of an unrepresentative sample. It is the ability to quantify this risk that will enable us to generalize with confidence from a random sample to the corresponding population." (Roxy Peck et al, "Introduction to Statistics and Data Analysis" 4th Ed., 2012)

"Decision trees are an important tool for decision making and risk analysis, and are usually represented in the form of a graph or list of rules. One of the most important features of decision trees is the ease of their application. Being visual in nature, they are readily comprehensible and applicable. Even if users are not familiar with the way that a decision tree is constructed, they can still successfully implement it. Most often decision trees are used to predict future scenarios, based on previous experience, and to support rational decision making." (Jelena Djuris et al, "Neural computing in pharmaceutical products and process development", Computer-Aided Applications in Pharmaceutical Technology, 2013)

"Without context, data is useless, and any visualization you create with it will also be useless. Using data without knowing anything about it, other than the values themselves, is like hearing an abridged quote secondhand and then citing it as a main discussion point in an essay. It might be okay, but you risk finding out later that the speaker meant the opposite of what you thought." (Nathan Yau, "Data Points: Visualization That Means Something", 2013)

"The more complex the system, the more variable (risky) the outcomes. The profound implications of this essential feature of reality still elude us in all the practical disciplines. Sometimes variance averages out, but more often fat-tail events beget more fat-tail events because of interdependencies. If there are multiple projects running, outlier (fat-tail) events may also be positively correlated - one IT project falling behind will stretch resources and increase the likelihood that others will be compromised." (Paul Gibbons, "The Science of Successful Organizational Change",  2015)

"Roughly stated, the No Free Lunch theorem states that in the lack of prior knowledge (i.e. inductive bias) on average all predictive algorithms that search for the minimum classification error (or extremum over any risk metric) have identical performance according to any measure." (N D Lewis, "Deep Learning Made Easy with R: A Gentle Introduction for Data Science", 2016)

"Premature enumeration is an equal-opportunity blunder: the most numerate among us may be just as much at risk as those who find their heads spinning at the first mention of a fraction. Indeed, if you’re confident with numbers you may be more prone than most to slicing and dicing, correlating and regressing, normalizing and rebasing, effortlessly manipulating the numbers on the spreadsheet or in the statistical package - without ever realizing that you don’t fully understand what these abstract quantities refer to. Arguably this temptation lay at the root of the last financial crisis: the sophistication of mathematical risk models obscured the question of how, exactly, risks were being measured, and whether those measurements were something you’d really want to bet your global banking system on." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Behavioral finance so far makes conclusions from statics not dynamics, hence misses the picture. It applies trade-offs out of context and develops the consensus that people irrationally overestimate tail risk (hence need to be 'nudged' into taking more of these exposures). But the catastrophic event is an absorbing barrier. No risky exposure can be analyzed in isolation: risks accumulate. If we ride a motorcycle, smoke, fly our own propeller plane, and join the mafia, these risks add up to a near-certain premature death. Tail risks are not a renewable resource." (Nassim N Taleb, "Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications" 2nd Ed., 2022)

"Any time you run regression analysis on arbitrary real-world observational data, there’s a significant risk that there’s hidden confounding in your dataset and so causal conclusions from such analysis are likely to be (causally) biased." (Aleksander Molak, "Causal Inference and Discovery in Python", 2023)

"[Making reasoned macro calls] starts with having the best and longest-time-series data you can find. You may have to take some risks in terms of the quality of data sources, but it amazes me how people are often more willing to act based on little or no data than to use data that is a challenge to assemble." (Robert J Shiller)

27 November 2016

♟️Strategic Management: Risk (Just the Quotes)

"The decision which achieves organization objectives must be both (1) technologically sound and (2) carried out by people. If we lose sight of the second requirement or if we assume naively that people can be made to carry out whatever decisions are technically sound - we run the risk of decreasing rather than increasing the effectiveness of the organization." (Douglas McGregor, "The Human Side of Enterprise", 1960)

"But the greater the primary risk, the safer and more careful your secondary assumptions must be. A project is only as sound as its weakest assumption, or its largest uncertainty." (Robert Heller, "The Naked Manager: Games Executives Play", 1972)

"Management theory is obsessed with risks. Top executives bemoan the lack of risk-taking initiative among their young. Politicians and stockholders are advised (by directors) to make directors rich, so that they can afford to take risks. Theorists teach how to construct decision trees, heraldic devices of scientific management; and how to marry the trees with probability theory, so that the degree of risk along each branch (each branch and twig representing alternative results of alternative courses of action) can be metered. But the measuring is spurious, and, anyway, the best management doesn't take risks. It avoids them. It goes for the sure thing.(Robert Heller, "The Naked Manager: Games Executives Play", 1972)

"Taking no action to solve these problems is equivalent of taking strong action. Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth. A decision to do nothing is a decision to increase the risk of collapse." (Donella Meadows et al, "The Limits to Growth", 1972) 

"Overly optimistic goals nearly always result in one of two extremes. If the goal is seen as a must, then the division manager must 'go for broke. This can result in reckless risk taking. More commonly [...] ultraconservative action. The reasoning is: "Why take any chances to achieve an unattainable goal."(Bruce Henderson, "Henderson on Corporate Strategy", 1979)

"Risk is a function of how poorly a strategy will perform if the 'wrong' scenario occurs." (Michael Porter, "Competitive Advantage: Creating and Sustaining Superior Performance", 1985)

"The risk of making a decision that's wrong is so enormous that sometimes it just crushes people so that they can't make any decision at all because they're afraid of making the wrong decision." (James M McPherson, "An Exchange With a Civil War Historian", 1995)

"Until we can distinguish between an event that is truly random and an event that is the result of cause and effect, we will never know whether what we see is what we'll get, nor how we got what we got. When we take a risk, we are betting on an outcome that will result from a decision we have made, though we do not know for certain what the outcome will be. The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control over the outcome and the linkage between effect and cause is hidden from us." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Risk management is the explicit quantitative declaration of uncertainty. But in some corporate cultures, people aren’t allowed to be uncertain. They’re allowed to be wrong, but they can’t be uncertain. They are obliged to look their bosses and clients in the face and lie rather than show uncertainty about outcomes. Uncertainty is for wimps." (Tom DeMarco, "Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency", 2001)

"Risk mitigation is the set of actions you will take to reduce the impact of a risk should it materialize. There are two not-immediately-obvious aspects to risk mitigation: The plan has to precede materialization. Some of the mitigation activities must also precede materialization." (Tom DeMarco, "Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency", 2001)

"According to the traditional distinction from economics, risk is measurable, whereas uncertainty is indefinite or incalculable. In truth, risk can never be measured precisely except in dice rolls and games of chance, called a priori probability. Risk can only be estimated from observations in the real world, but to do that, we need to take a sample, and estimate the underlying distribution. In a sense, our estimates of real-world volatility are themselves volatile. Failure to realize this fundamental untidiness of the real world is called the ludic fallacy from the Latin for games. […] However, when the term risk measurement is used as opposed to risk estimation, a degree of precision is suggested that is unrealistic, and the choice of language suggests that we know more than we do. Even the language '​​​​​​risk management'​​​​​​ implies we can do more than we can." (Paul Gibbons, "The Science of Successful Organizational Change",  2015)

"Change strategy is, by this definition, the way a business (1) manages the portfolio of change to make sure that the parts deliver the whole business strategy, (2) creates the context for change, and (3) monitors change risk and change performance across the entire business." (Paul Gibbons, "The Science of Successful Organizational Change", 2015)

"After you think, you act. After you act, you learn. Make decisions, but decisions will have risks of mistakes. But make sure you avoid disastrous mistakes and avoid making the same mistake twice." (Sukanto Tanoto, [Keynote speech] 2015)

"Governance and leadership are the yin and the yang of successful organisations. If you have leadership without governance you risk tyranny, fraud and personal fiefdoms. If you have governance without leadership you risk atrophy, bureaucracy and indifference." (Mark Goyder, "What Matters in Corporate Governance?", 2015)

"Our minds, especially our intuitions, are not equipped to deal with a probabilistic world. Risk and prediction are widely misunderstood, […] All decision making in a probabilistic world involves estimating the likelihood of an event and how much we will value it (affective forecasting). Humans are bad at both - ​​​​​ particularly at the former. […] In business, understanding the psychology of risk is more important than understanding the mathematics of risk." (Paul Gibbons, "The Science of Successful Organizational Change",  2015)

"Often greater risk is involved in postponement than in making a wrong decision." (Harry A Hopf)

28 December 2013

🚧Project Management: Risk (Just the Quotes)

"But the greater the primary risk, the safer and more careful your secondary assumptions must be. A project is only as sound as its weakest assumption, or its largest uncertainty." (Robert Heller, "The Naked Manager: Games Executives Play", 1972)

"Today, most project management practitioners focus on planning failure. If this aspect of the project can be compressed, or even eliminated, then the magnitude of the actual failure, should it occur, would be diminished. A good project management methodology helps to reduce planning failure. Today, we believe that planning failure, when it occurs, is due in large part to the project manager’s inability to perform effective risk management." (Harold Kerzner, "Strategic Planning for Project Management using a Project Management Maturity Model", 2001)

"Risks and benefits always go hand in hand. The reason that a project is full of risk is that it leads you into uncharted waters. It stretches your capability, which means that if you pull it off successfully, it's going to drive your competition batty. The ultimate coup is to stretch your own capability to a point beyond the competition's ability to respond. This is what gives you competitive advantage and helps you build a distinct brand in the market." (Tom DeMarco & Timothy Lister, "Waltzing with Bears: Managing Risk on Software Projects", 2003)

"The business of believing only what you have a right to believe is called risk management." (Tom DeMarco & Timothy Lister, "Waltzing with Bears: Managing Risk on Software Projects", 2003)

"In project management there are two levels of opportunities and risks. Because a project is the pursuit of an opportunity, the first category, the macro opportunity, is the project opportunity itself. The approach to achieving the project opportunity and the mitigation of associated project-level risks are structured into the strategy and tactics of the project cycle, the selected decision gates, the teaming arrangements, key personnel selected, and so on. The second level encompasses the tactical opportunities and risks within the project that become apparent at lower levels of decomposition and as project cycle phases are planned and executed. This can include emerging, unproven technology; incremental and evolutionary methods that promise high returns; and the temptation to circumvent proven practices in order to deliver better, faster, and cheaper." (Kevin Forsberg et al, "Visualizing Project Management: Models and frameworks for mastering complex systems" 3rd Ed., 2005)

"Opportunities and risks are endemic to the project environment. However well planned a project may be, there will always be residual project risk." (Kevin Forsberg et al, "Visualizing Project Management: Models and frameworks for mastering complex systems" 3rd Ed., 2005)

"When we pursue opportunity, we normally incur risk. The opportunity to experience the thrill of an exciting sport like hang gliding or scuba diving brings with it the attendant risks. Many people instinctively make the trade that the thrill is worth the risks. Others decline." (Kevin Forsberg et al, "Visualizing Project Management: Models and frameworks for mastering complex systems" 3rd Ed., 2005)

"For most projects there will be many sources of risk. Assumptions that seem quite reasonable at the start of a project may be proven otherwise if and when conditions in internal or external environments change during the project duration." (Roger Jones & Neil Murra, "Change, Strategy and Projects at Work", 2008)

"Routine tasks are, by their nature, familiar to us. The outcomes of performing routine tasks are therefore usually highly predictable. Project work by contrast includes elements of risk and uncertainty associated with the uniqueness and unfamiliarity of some of the work or the context in which it is carried out. Murphy’s Law expresses a ‘tongue-in-cheek’ but fallacious certainty of things going wrong, if it is possible for them to go wrong." (Roger Jones & Neil Murra, "Change, Strategy and Projects at Work", 2008)

"Whilst culture can help create a sense of belonging and shared destiny, it can also prove to be an obstacle to change especially where the existing culture is risk averse or if the change strategy is perceived by some to challenge prevailing group values. Where radical change is proposed, the achievement of cultural change may actually be a major objective of the proposed change." (Roger Jones & Neil Murra, "Change, Strategy and Projects at Work", 2008)

"A project is usually considered a failure if it is late, is over budget, or does not meet the customer’s expectations. Without the control that project management provides, a project is more likely to have problems with one of these areas. A problem with only one constraint (scope, schedule, cost, resources, quality, and risk) can jeopardize the entire project." (Sandra F Rowe, "Project Management for Small Projects" 3rd Ed., 2020)

27 November 2007

🏗️Software Engineering: Risks (Just the Quotes)

"The major distinguishing feature of the spiral model is that it creates a risk-driven approach to the software process rather than a primarily document-driven or code-driven process. It incorporates many of the strengths of other models and resolves many of their difficulties." (Barry Boehm, "A spiral model of software development and enhancement", IEEE, 1988)

"Refactoring is risky. It requires changes to working code that can introduce subtle bugs. Refactoring, if not done properly, can set you back days, even weeks. And refactoring becomes riskier when practiced informally or ad hoc." (Erich Gamma, 2002)

"The business of believing only what you have a right to believe is called risk management." (Tom DeMarco & Timothy Lister, "Waltzing with Bears: Managing Risk on Software Projects", 2003)

"Developing fewer features allows you to conserve development resources and spend more time refining those features that users really need. Fewer features mean fewer things to confuse users, less risk of user errors, less description and documentation, and therefore simpler Help content. Removing any one feature automatically increases the usability of the remaining ones." (Jakob Nielsen, "Prioritizing Web Usability", 2006)

"Duplication is the primary enemy of a well-designed system. It represents additional work, additional risk, and additional unnecessary complexity."  (Robert C Martin, "Clean Code: A Handbook of Agile Software Craftsmanship", 2008)

"Modeling is the creation of abstractions or representations of the system to predict and analyze performance, costs, schedules, and risks and to provide guidelines for systems research, development, design, manufacture, and management. Modeling is the centerpiece of systems architecting - a mechanism of communication to clients and builders, of design management with engineers and designers, of maintaining system integrity with project management, and of learning for the architect, personally."  (Mark W Maier, "The Art Systems of Architecting" 3rd Ed., 2009)

"In essence, Continuous Integration is about reducing risk by providing faster feedback. First and foremost, it is designed to help identify and fix integration and regression issues faster, resulting in smoother, quicker delivery, and fewer bugs. By providing better visibility for both technical and non-technical team members on the state of the project, Continuous Integration can open and facilitate communication channels between team members and encourage collaborative problem solving and process improvement. And, by automating the deployment process, Continuous Integration helps you get your software into the hands of the testers and the end users faster, more reliably, and with less effort." (John F Smart, "Jenkins: The Definitive Guide", 2011)

"Systems with high risks must be tested more thoroughly than systems that do not generate big losses if they fail. The risk assessment must be done for the individual system parts, or even for single error possibilities. If there is a high risk for failures by a system or subsystem, there must be a greater testing effort than for less critical (sub)systems. International standards for production of safety-critical systems use this approach to require that different test techniques be applied for software of different integrity levels." (Andreas Spillner et al, "Software Testing Foundations: A Study Guide for the Certified Tester Exam" 4th Ed., 2014)

"Sometimes you can’t fit everything in. Remember that the sprint is great for testing risky solutions that might have a huge payoff. So you’ll have to reverse the way you would normally prioritize. If a small fix is so good and low-risk that you’re already planning to build it next week, then seeing it in a prototype won’t teach you much. Skip those easy wins in favor of big, bold bets." (Jake Knapp et al, "Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days", 2016)

