19 June 2006

✒️Stephen G Haines - Collected Quotes

"Delay time, the time between causes and their impacts, can highly influence systems. Yet the concept of delayed effect is often missed in our impatient society, and when it is recognized, it’s almost always underestimated. Such oversight and devaluation can lead to poor decision making as well as poor problem solving, for decisions often have consequences that don’t show up until years later. Fortunately, mind mapping, fishbone diagrams, and creativity/brainstorming tools can be quite useful here." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Our simplistic cause-effect analyses, especially when coupled with the desire for quick fixes, usually lead to far more problems than they solve - impatience and knee-jerk reactions included. If we stop for a moment and take a good look our world and its seven levels of complex and interdependent systems, we begin to understand that multiple causes with multiple effects are the true reality, as are circles of causality-effects." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Strategic planning and strategic change management are really 'strategic thinking'. It’s about clarity and simplicity, meaning and purpose, and focus and direction." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Systems thinking is based on the theory that a system is, in essence, circular. Using a systems approach in your strategic management, therefore, provides a circular implementing structure that can evolve, with continuously improving, self-checking, and learning capabilities [...]" (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"The systems approach, on the other hand, provides an expanded structural design of organizations as living systems that more accurately reflects reality." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"This is what systems thinking is all about: the idea of building an organization in which each piece, and partial solution of the organization has the fit, alignment, and integrity with your overall organization as a system, and its outcome of serving the customer." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"True systems thinking, on the other hand, studies each problem as it relates to the organization’s objectives and interaction with its entire environment, looking at it as a whole within its universe. Taking your organization from a partial systems to a true systems state requires effective strategic management and backward thinking." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

17 June 2006

✒️Yaneer Bar-Yam - Collected Quotes

"A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components. The amount of information necessary to describe the behavior of such a system is a measure of its complexity." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Many of the systems that surround us are complex. The goal of understanding their properties motivates much if not all of scientific inquiry. […] all scientific endeavor is based, to a greater or lesser degree, on the existence of universality, which manifests itself in diverse ways. In this context, the study of complex systems as a new endeavor strives to increase our ability to understand the universality that arises when systems are highly complex." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"There are two approaches to organizing the properties of complex systems that will serve as the foundation of our discussions. The first of these is the relationship between elements, parts and the whole. Since there is only one property of the complex system that we know for sure - that it is complex - the primary question we can ask about this relationship is how the complexity of the whole is related to the complexity of the parts. […] The second approach to the study of complex systems begins from an understanding of the relationship of systems to their descriptions. The central issue is defining quantitatively what we mean by complexity."  (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"A fundamental reason for the difficulties with modern engineering projects is their inherent complexity. The systems that these projects are working with or building have many interdependent parts, so that changes in one part often have effects on other parts of the system. These indirect effects are frequently unanticipated, as are collective behaviors that arise from the mutual interactions of multiple components. Both indirect and collective effects readily cause intolerable failures of the system. Moreover, when the task of the system is intrinsically complex, anticipating the many possible demands that can be placed upon the system, and designing a system that can respond in all of the necessary ways, is not feasible. This problem appears in the form of inadequate specifications, but the fundamental issue is whether it is even possible to generate adequate specifications for a complex system." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Complex problems are the problems that persist - the problems that bounce back and continue to haunt us. People often go through a series of stages in dealing with such problems - from believing they are beyond hope, to galvanizing collective efforts of many people and dollars to address the problem, to despair, retreat, and rationalization." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Emergence refers to the relationship between the details of a system and the larger view. Emergence does not emphasize the primary importance of the details or of the larger view; it is concerned with the relationship between the two. Specifically, emergence seeks to discover: Which details are important for the larger view, and which are not? How do collective properties arise from the properties of parts? How does behavior at a larger scale of the system arise from the detailed structure, behavior and relationships on a finer scale?" (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Engineers use abstraction to simplify the description or specification of the system, extracting the properties of the system they find most relevant and ignoring other details. While this is a useful tool, it assumes that the details that will be provided to one part of the system (module) can be designed independently of details in other parts."  (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Modularity, an approach that separates a large system into simpler parts that are individually designed and operated, incorrectly assumes that complex system behavior can essentially be reduced to the sum of its parts. A planned decomposition of a system into modules works well for systems that are not too complex. […] However, as systems become more complex, this approach forces engineers to devote increasing attention to designing the interfaces between parts, eventually causing the process to break down."  (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The basic idea of systems engineering is that it is possible to take a large and highly complex system that one wants to build, separate it into key parts, give the parts to different groups of people to work on, and coordinate their development so that they can be put together at the end of the process. This mechanism is designed to be applied recursively, so that we separate the large system into parts, then the parts into smaller parts, until each part is small enough for one person to execute. Then we put all of the parts together until the entire system works." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The collapse of a particular project may appear to have a specific cause, but an overly high intrinsic complexity of these systems is a problem common to many of them. A chain always breaks first in one particular link, but if the weight it is required to hold is too high, failure of the chain is guaranteed." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The complexity of each individual or organization must match the complexity of the task each is to perform. When we think about a highly complex problem, we are generally thinking about tasks that are more complex than a single individual can understand. Otherwise, complexity is not the main issue in solving it. If a problem is more complex than a single individual, the only way to solve it is to have a group of people - organized appropriately - solve it together. When an organization is highly complex it can only function by making sure that each individual does not have to face the complexity of the task of the organization as a whole. Otherwise failure will occur most of the time." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The complexity of engineering projects has been increasing, but this is not to say that this complexity is new. Engineers and managers are generally aware of the complexity of these projects and have developed systematic techniques that are often useful in addressing it. Notions like modularity, abstraction, hierarchy and layering allow engineers to usefully analyze the complex systems they are working with. At a certain level of interdependence, though, these standard approaches become ineffective." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The most basic issue for organizational success is correctly matching a system’s complexity to its environment. When we want to accomplish a task, the complexity of the system performing that task must match the complexity of the task. In order to perform the matching correctly, one must recognize that each person has a limited level of complexity. Therefore, tasks become difficult because the complexity of a person is not large enough to handle the complexity of the task. The trick then is to distribute the complexity of the task among many individuals." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"There is a dual nature to engineering. Engineers are responsible for careful quantitative evaluation of how to achieve objectives, what to do to achieve them, and even (a task that most people find almost impossible) how long it will take to do the task. The other side of engineering is an independent creative 'cowboy'-type attitude characteristic of people breaking out of the mold, coming up with novel ideas, implementing them, and changing the world through new technology. This is the culture of high-tech innovation." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"There is no doubt that science has made great progress by taking things apart, but it has become increasingly clear that many important questions can only be addressed by thinking more carefully about relationships between and amongst the parts. Indeed, one of the main difficulties in answering questions or solving problems - any kind of problem - is that we think the problem is in the parts, when it is really in the relationships between them." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"What do people do today when they don’t understand 'the system'? They try to assign responsibility to someone to fix the problem, to oversee 'the system', to coordinate and control what is happening. It is time we recognized that 'the system' is how we work together. When we don’t work together effectively putting someone in charge by its very nature often makes things worse, rather than better, because no one person can understand 'the system' well enough to be responsible. We need to learn how to improve the way we work together, to improve 'the system' without putting someone in charge, in order to make things work." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"What is the solution to coordinating people to perform complex tasks? Analyzing the flows of information and the way tasks are distributed through the system can help. ultimately, however, the best solution is to create an environment where evolution can take place. Organizations that learn by evolutionary change create an environment of ongoing innovation. Evolution by competition and cooperation and the creation of composites of patterns of behavior is the way to synthesize effective systems to meet the complex challenges of today’s world." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"When parts are acting independently, the fine scale behavior is more complex. When they are working together, the fine scale complexity is much smaller, but the behavior is on a larger scale. This means that complexity is always a trade-off, more complex at a large scale means simpler at a fine scale. This trade-off is a basic conceptual tool that we need in order to understand complex systems." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004) 

✒️Fernando J Corbató - Collected Quotes

"Systems with unknown behavioral properties require the implementation of iterations which are intrinsic to the design process but which are normally hidden from view. Certainly when a solution to a well-understood problem is synthesized, weak designs are mentally rejected by a competent designer in a matter of moments. On larger or more complicated efforts, alternative designs must be explicitly and iteratively implemented. The designers perhaps out of vanity, often are at pains to hide the many versions which were abandoned and if absolute failure occurs, of course one hears nothing. Thus the topic of design iteration is rarely discussed. Perhaps we should not be surprised to see this phenomenon with software, for it is a rare author indeed who publicizes the amount of editing or the number of drafts he took to produce a manuscript." (Fernando J Corbató, "A Managerial View of the Multics System Development", 1977)

"Because one has to be an optimist to begin an ambitious project, it is not surprising that underestimation of completion time is the norm." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

"Design bugs are often subtle and occur by evolution with early assumptions being forgotten as new features or uses are added to systems." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

"It is important to emphasize the value of simplicity and elegance, for complexity has a way of compounding difficulties and as we have seen, creating mistakes. My definition of elegance is the achievement of a given functionality with a minimum of mechanism and a maximum of clarity." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

"The value of metaphors should not be underestimated. Metaphors have the virtue of an expected behavior that is understood by all. Unnecessary communication and misunderstandings are reduced. Learning and education are quicker. In effect metaphors are a way of internalizing and abstracting concepts allowing one's thinking to be on a higher plane and low-level mistakes to be avoided." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

16 June 2006

✒️Karl E Weick - Collected Quotes

"If all of the elements in a large system are loosely coupled to one another, then any one element can adjust to and modify a local a local unique contingency without affecting the whole system. These local adaptations can be swift, relatively economical, and substantial." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"In a loosely coupled system there is more room available for self-determination by the actors. If it is argued that a sense of efficacy is crucial for human beings. when a sense of efficacy might be greater in a loosely coupled system with autonomous units than it would be in a tightly coupled system where discretion is limited." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"Managers construct, rearrange, single out, and demolish many objective features of their surroundings. When people act they unrandomize variables, insert vestiges of orderliness, and literally create their own constraints." (Karl E Weick, "Social Psychology of Organizing", 1979)

"The typical coupling mechanisms of authority of office and logic of the task do not operate in educational organizations." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"Any approach to the study of organizations is built on specific assumptions about the nature of organizations and how they are designed and function." (Richard L Daft & Karl E Weick, "Toward a model of organizations as interpretation systems", Academy of Management Review Vol 9 (2), 1984)

"Action often creates the orderly relations that originally were mere presumptions summarized in a cause map. Thus language trappings of organizations such as strategic plans are important components in the process of creating order. They hold events together long enough and tightly enough in people's heads so that they act in the belief that their actions will be influential and make sense." (Karl E. Weick, "Organizational culture as a source of high reliability", 1987)

"An ordered set of assertions about a generic behavior or structure assumed to hold throughout a significantly broad range of specific instances." (Karl E Weick, "Theory construction as disciplined imagination", 1989)

"Experience is the consequence of activity. The manager literally wades into the swarm of 'events' that surround him and actively tries to unrandomize them and impose some order: The manager acts physically in the environment, attends to some of it, ignores most of it, talks to other people about what they see and are doing."  (Karl E Weick, "Sensemaking in Organizations", 1995)

"Organizations are presumed to talk to themselves over and over to find out what they are thinking." (Karl E Weick, "Sensemaking in Organizations", 1995)

"Sensemaking is about the enlargement of small cues. It is a search for contexts within which small details fit together and make sense. It is people interacting to flesh out hunches. It is a continuous alternation between particulars and explanations with each cycle giving added form and substance to the other." (Karl E Weick, "Sensemaking in Organizations", 1995)

"Sensemaking tends to be swift, which means we are more likely to see products than processes." (Karl E Weick, Sensemaking in Organizations, 1995)

"The organism or group enacts equivocal raw talk, the talk is viewed retrospectively, sense is made of it, and then this sense is stored as knowledge in the retention process. The aim of each process has been to reduce equivocality and to get some idea of what has occurred." (Karl E Weick, "Sensemaking in Organizations", 1995)

"The point we want to make here is that sensemaking is about plausibility, coherence, and reasonableness. Sensemaking is about accounts that are socially acceptable and credible... It would be nice if these accounts were also accurate. But in an equivocal, postmodern world, infused with the politics of interpretation and conflicting interests and inhabited by people with multiple shifting identities, an obsession with accuracy seems fruitless, and not of much practical help, either." (Karl E Weick, "Sensemaking in Organizations", 1995)

"To talk about sensemaking is to talk about reality as an ongoing accomplishment that takes form when people make retrospective sense of the situations in which they find themselves and their creations. There is a strong reflexive quality to this process. People make sense of things by seeing a world on which they already imposed what they believe. In other words, people discover their own inventions. This is why sensemaking can be understood as invention and interpretations understood as discovery. These are complementary ideas. If sensemaking is viewed as an act of invention, then it is also possible to argue that the artifacts it produces include language games and texts." (Karl E Weick, "Sensemaking in Organizations", 1995)

"When people perform an organized action sequence and are interrupted, they try to make sense of it. The longer they search, the higher the arousal, and the stronger the emotion. If the interruption slows the accomplishment of an organized sequence, people are likely to experience anger. If the interruption has accelerated accomplishment, then they are likely to experience pleasure. If people find that the interruption can be circumvented, they experience relief. If they find that the interruption has thwarted a higher level plan, then anger is likely to turn into rage, and if they find that the interruption has thwarted a minor behavioural sequence, they are likely to feel irritated." (Karl E Weick, "Sensemaking in Organizations", 1995)

"The basic idea of sensemaking is that reality is an ongoing accomplishment that emerges from efforts to create order and make retrospective sense of what occurs." (Karl E Weick, "The collapse of sensemaking in organizations: The Mann Gulch disaster", Administrative Science Quarterly 3, 1993)

14 June 2006

✒️Malcolm Gladwell - Collected Quotes

"That is the paradox of the epidemic: that in order to create one contagious movement, you often have to create many small movements first." (Malcolm Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire." (Malcolm Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"Every moment – every blink – is composed of a series of discrete moving parts, and every one of those parts offers an opportunity for intervention, for reform, and for correction." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"Often a sign of expertise is noticing what doesn't happen." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"Taking our powers of rapid cognition seriously means we have to acknowledge the subtle influences that can alter or undermine or bias the products of our unconscious." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"The key to good decision making is not knowledge. It is understanding. We are swimming in the former. We are desperately lacking in the latter." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"The values of the world we inhabit and the people we surround ourselves with have a profound effect on who we are." (Malcolm Gladwell, "Outliers: The Story of Success", 2008)

"Those three things - autonomy, complexity, and a connection between effort and reward - are, most people will agree, the three qualities that work has to have if it is to be satisfying." (Malcolm Gladwell, "Outliers: The Story of Success", 2008)

"Truly successful decision-making relies on a balance between deliberate and instinctive thinking." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"We learn by example and by direct experience because there are real limits to the adequacy of verbal instruction." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)


13 June 2006

✒️Carlos Gershenson - Collected Quotes

"Self-organization can be seen as a spontaneous coordination of the interactions between the components of the system, so as to maximize their synergy. This requires the propagation and processing of information, as different components perceive different aspects of the situation, while their shared goal requires this information to be integrated. The resulting process is characterized by distributed cognition: different components participate in different ways to the overall gathering and processing of information, thus collectively solving the problems posed by any perceived deviation between the present situation and the desired situation." (Carlos Gershenson & Francis Heylighen, "How can we think the complex?", 2004)

[synergy:] "Measure describing how one agent or system increases the satisfaction of other agents or systems." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly.  A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007) 

"To develop a Control, the designer should find aspect systems, subsystems, or constraints that will prevent the negative interferences between elements (friction) and promote positive interferences (synergy). In other words, the designer should search for ways of minimizing frictions that will result in maximization of the global satisfaction" (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Complexity carries with it a lack of predictability different to that of chaotic systems, i.e. sensitivity to initial conditions. In the case of complexity, the lack of predictability is due to relevant interactions and novel information created by them." (Carlos Gershenson, "Understanding Complex Systems", 2011)

"Complexity has shown that reductionism is limited, in the sense that emergent properties cannot be reduced. In other words, the properties at a given scale cannot be always described completely in terms of properties at a lower scale. This has led people to debate on the reality of phenomena at different scales." (Carlos Gershenson, "Complexity", 2011)

09 June 2006

✒️Scott E Page - Collected Quotes

"Effective models require a real world that has enough structure so that some of the details can be ignored. This implies the existence of solid and stable building blocks that encapsulate key parts of the real system’s behavior. Such building blocks provide enough separation from details to allow modeling to proceed." (John H Miller & Scott E Page, "Complex Adaptive Systems: An Introduction to Computational Models of Social Life", 2007)

"Models need to be judged by what they eliminate as much as by what they include - like stone carving, the art is in removing what you do not need." (John H Miller & Scott E Page, "Complex Adaptive Systems: An Introduction to Computational Models of Social Life", 2007)

"A heuristic is a rule applied to an existing solution represented in a perspective that generates a new (and hopefully better) solution or a new set of possible solutions." (Scott E Page, "The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools and Societies", 2008)

"A perspective is a map from reality to an internal language such that each distinct object, situation, problem, or event gets mapped to a unique word." (Scott E Page, "The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools and Societies", 2008)

"[...] diverse, connected, interdependent entities whose behavior is determined by rules, which may adapt, but need not. The interactions of these entities often produce phenomena that are more than the parts. These phenomena are called emergent." (Scott E Page, "Diversity and Complexity", 2010)

"If we can understand how to leverage diversity to achieve better performance and greater robustness, we might anticipate and prevent collapses." (Scott E Page, "Diversity and Complexity", 2010)

"[…] many-model thinking produces wisdom through a diverse ensemble of logical frames. The various models accentuate different causal forces. Their insights and implications overlap and interweave. By engaging many models as frames, we develop nuanced, deep understandings." (Scott E Page," The Model Thinker", 2018)

"Models are formal structures represented in mathematics and diagrams that help us to understand the world. Mastery of models improves your ability to reason, explain, design, communicate, act, predict, and explore." (Scott E Page, "The Model Thinker", 2018)

"Collective intelligence is where the whole is smarter than any one individual in it. You can think of it that in a predictive context, this could be the wisdom of crowds, sort of thing where people guessing the weight of a steer, the crowd’s guess is going to be better than the average guess of the person in it." (Scott E Page [interview])

"Diverse groups of problem solvers outperformed the groups of the best individuals at solving complex problems. The reason: the diverse groups got stuck less often than the smart individuals, who tended to think similarly." (Scott E Page)

✒️Fritjof Capra - Collected Quotes

"The new paradigm may be called a holistic world view, seeing the world as an integrated whole rather than a dissociated collection of parts. It may also be called an ecological view, if the term 'ecological' is used in a much broader and deeper sense than usual. Deep ecological awareness recognizes the fundamental interdependence of all phenomena and the fact that, as individuals and societies we are all embedded in (and ultimately dependent on) the cyclical process of nature." (Fritjof Capra & Gunter A Pauli, "Steering Business Toward Sustainability", 1995)

"[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The more we study the major problems of our time, the more we come to realise that they cannot be understood in isolation. They are systemic problems, which means that they are interconnected and interdependent." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"Understanding ecological interdependence means understanding relationships. It requires the shifts of perception that are characteristic of systems thinking - from the parts to the whole, from objects to relationships, from contents to patterns. [...]  Nourishing the community means nourishing those relationships." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"What is sustained in a sustainable community is not economic growth, development, market share, or competitive advantage, but the entire web of life on which our long-term survival depends. In other words, a sustainable community is designed in such a way that its ways of life, businesses, economy, physical structures, and technologies do not interfere with nature’s inherent ability to sustain life." (Fritjof Capra, "Ecoliteracy: The Challenge for Education in the Next Century", 1999)

"One of the key insights of the systems approach has been the realization that the network is a pattern that is common to all life. Wherever we see life, we see networks." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"This spontaneous emergence of order at critical points of instability, which is often referred to simply as 'emergence', is one of the hallmarks of life. It has been recognized as the dynamic origin of development, learning, and evolution. In other words, creativity - the generation of new forms - is a key property of all living systems." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

08 June 2006

✒️Fred C Schweppe - Collected Quotes

 "A bias can be considered a limiting case of a nonwhite disturbance as a constant is the most time-correlated process possible." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Changes of variables can be helpful for iterative and parametric solutions even if they do not linearize the problem. For example, a change of variables may change the 'shape' of J(x) into a more suitable form. Unfortunately there seems to be no· general way to choose the 'right' change of variables. Success depends on the particular problem and the engineer's insight. However, the possibility of a change of variables should always be considered."(Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Decision-making problems (hypothesis testing) involve situations where it is desired to make a choice among various alternative decisions (hypotheses). Such problems can be viewed as generalized state estimation problems where the definition of state has simply been expanded." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Hypothesis testing can introduce the need for multiple models for the multiple hypotheses and,' if appropriate, a priori probabilities. The one modeling aspect of hypothesis testing that has no estimation counterpart is the problem of specifying the hypotheses to be considered. Often this is a critical step which influences both performance arid the difficulty of implementation." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Modeling is definitely the most important and critical problem. If the mathematical model is not valid, any subsequent analysis, estimation, or control study is meaningless. The development of the model in a convenient form can greatly reduce the complexity of the actual studies." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Pattern recognition can be viewed as a special case of hypothesis testing. In pattern recognition, an observation z is to be used to decide what pattern caused it. Each possible pattern can be viewed as one hypothesis. The main problem in pattern recognition is the development of models for the z corresponding to each pattern (hypothesis)." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"System theory is a tool which engineers use to help them design the 'best' system to do the job that must be done. A dominant characteristic of system theory is the interest in the analysis and design (synthesis) of systems from an input-output point of view. System theory uses mathematical manipulation of a mathematical model to help design the actual system." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The biggest (and sometimes insurmountable) problem is usually to use the available data (information, measurements, etc.) to find out what the system is actually doing (i.e., to estimate its state). If the system's state can be estimated to some reasonable accuracy, the desired control is often obvious (or can be obtained by the use of deterministic control theory)." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The choice of model is often the most critical aspect of a design and development engineering job, but it is impossible to give explicit rules or techniques." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The power and beauty of stochastic approximation theory is that it provides simple, easy to implement gain sequences which guarantee convergence without depending (explicitly) on knowledge of the function to be minimized or the noise properties. Unfortunately, convergence is usually extremely slow. This is to be expected, as "good performance" cannot be expected if no (or very little) knowledge of the nature of the problem is built into the algorithm. In other words, the strength of stochastic approximation (simplicity, little a priori knowledge) is also its weakness." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The pseudo approach to uncertainty modeling refers to the use of an uncertainty model instead of using a deterministic model which is actually (or at least theoretically) available. The uncertainty model may be desired because it results in a simpler analysis, because it is too difficult (expensive) to gather all the data necessary for an exact model, or because the exact model is too complex to be included in the computer." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"[A] system is represented by a mathematical model which may take many forms, such as algebraic equations, finite state machines, difference equations, ordinary differential equations, partial differential equations, and functional equations. The system model may be uncertain, as the mathematical model may not be known completely." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The term hypothesis testing arises because the choice as to which process is observed is based on hypothesized models. Thus hypothesis testing could also be called model testing. Hypothesis testing is sometimes called decision theory. The detection theory of communication theory is a special case." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

07 June 2006

✒️F David Peat - Collected Quotes

"A model is a simplified picture of physical reality; one in which, for example, certain contingencies such as friction, air resistance, and so on have been neglected. This model reproduces within itself some essential feature of the universe. While everyday events in nature are highly contingent and depend upon all sorts of external perturbations and contexts, the idealized model aims to produce the essence of phenomena." (F David Peat, "From Certainty to Uncertainty", 2002)

"A system at a bifurcation point, when pushed slightly, may begin to oscillate. Or the system may flutter around for a time and then revert to its normal, stable behavior. Or, alternatively it may move into chaos. Knowing a system within one range of circumstances may offer no clue as to how it will react in others. Nonlinear systems always hold surprises." (F David Peat, "From Certainty to Uncertainty", 2002)

"A theory makes certain predictions and allows calculations to be made that can be tested directly through experiments and observations. But a theory such as superstrings talks about quantum objects that exist in a multidimensional space and at incredibly short distances. Other grand unified theories would require energies close to those experienced during the creation of the universe to test their predictions." (F David Peat, "From Certainty to Uncertainty", 2002)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos itself is one form of a wide range of behavior that extends from simple regular order to systems of incredible complexity. And just as a smoothly operating machine can become chaotic when pushed too hard (chaos out of order), it also turns out that chaotic systems can give birth to regular, ordered behavior (order out of chaos). […] Chaos and chance don’t mean the absence of law and order, but rather the presence of order so complex that it lies beyond our abilities to grasp and describe it." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos theory explains the ways in which natural and social systems organize themselves into stable entities that have the ability to resist small disturbances and perturbations. It also shows that when you push such a system too far it becomes balanced on a metaphoric knife-edge. Step back and it remains stable; give it the slightest nudge and it will move into a radically new form of behavior such as chaos." (F David Peat, "From Certainty to Uncertainty", 2002)

"Lessons from chaos theory show that energy is always needed for reorganization. And for a new order to appear an organization must be willing to allow a measure of chaos to occur; chaos being that which no one can totally control. It means entering a zone where no one can predict the final outcome or be truly confident as to what will happen." (F David Peat, "From Certainty to Uncertainty", 2002)

"The theories of science are all about idealized models and, in turn, these models give pictures of reality. […] But when we speak of the quantum world we find we are employing concepts that simply do not fit. When we discuss our models of reality we are continually importing ideas that are inappropriate and have no real meaning in the quantum domain." (F David Peat, "From Certainty to Uncertainty", 2002)

"There are endless examples of elaborate structures and apparently complex processes being generated through simple repetitive rules, all of which can be easily simulated on a computer. It is therefore tempting to believe that, because many complex patterns can be generated out of a simple algorithmic rule, all complexity is created in this way." (F David Peat, "From Certainty to Uncertainty", 2002)

"[…] while chaos theory deals in regions of randomness and chance, its equations are entirely deterministic. Plug in the relevant numbers and out comes the answer. In principle at least, dealing with a chaotic system is no different from predicting the fall of an apple or sending a rocket to the moon. In each case deterministic laws govern the system. This is where the chance of chaos differs from the chance that is inherent in quantum theory." (F David Peat, "From Certainty to Uncertainty", 2002)

"While chaos theory is, in the last analysis, no more than a metaphor for human society, it can be a valuable metaphor. It makes us sensitive to the types of organizations we create and the way we deal with the situations that surround us." (F David Peat, "From Certainty to Uncertainty", 2002)

✒️Kevin Kelly - Collected Quotes

"A network nurtures small failures in order that large failures don't happen as often." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"An event is not triggered by a chain of being, but by a field of causes spreading horizontally, like creeping tide." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Artificial complex systems will be deliberately infused with organic principles simply to keep them going." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"At the other far extreme, we find many systems ordered as a patchwork of parallel operations, very much as in the neural network of a brain or in a colony of ants. Action in these systems proceeds in a messy cascade of interdependent events. Instead of the discrete ticks of cause and effect that run a clock, a thousand clock springs try to simultaneously run a parallel system. Since there is no chain of command, the particular action of any single spring diffuses into the whole, making it easier for the sum of the whole to overwhelm the parts of the whole. What emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is far more important. This is the swarm model." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Complexity must be grown from simple systems that already work." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Dumb parts, properly constituted into a swarm, yield smart results." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Evolution is a technological, mathematical, informational, and biological process rolled into one. It could almost be said to be a law of physics, a principle that reigns over all created multitudes, whether they have genes or not." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"If machines knew as much about each other as we know about each other (even in our privacy), the ecology of machines would be indomitable." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"It has long been appreciated by science that large numbers behave differently than small numbers. Mobs breed a requisite measure of complexity for emergent entities. The total number of possible interactions between two or more members accumulates exponentially as the number of members increases. At a high level of connectivity, and a high number of members, the dynamics of mobs takes hold. " (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Swarm systems generate novelty for three reasons: (1) They are 'sensitive to initial conditions' - a scientific shorthand for saying that the size of the effect is not proportional to the size of the cause - so they can make a surprising mountain out of a molehill. (2) They hide countless novel possibilities in the exponential combinations of many interlinked individuals. (3) They don’t reckon individuals, so therefore individual variation and imperfection can be allowed. In swarm systems with heritability, individual variation and imperfection will lead to perpetual novelty, or what we call evolution." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The central act of the coming era is to connect everything to everything." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The hardest lesson for humans to learn: that organic complexity will entail organic time." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The only organization capable of unprejudiced growth, or unguided learning, is a network. All other topologies limit what can happen." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The work of managing a natural environment is inescapably a work of local knowledge." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995) 

"The world of our own making has become so complicated that we must turn to the world of the born to understand how to manage it." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"To err is human; to manage error is system." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"When everything is connected to everything in a distributed network, everything happens at once. When everything happens at once, wide and fast moving problems simply route around any central authority. Therefore overall governance must arise from the most humble interdependent acts done locally in parallel, and not from a central command. A mob can steer itself, and in the territory of rapid, massive, and heterogeneous change, only a mob can steer. To get something from nothing, control must rest at the bottom within simplicity. " (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"A standalone object, no matter how well designed, has limited potential for new weirdness. A connected object, one that is a node in a network that interacts in some way with other nodes, can give birth to a hundred unique relationships that it never could do while unconnected. Out of this tangle of possible links come myriad new niches for innovations and interactions." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"All things being equal, choose technology that connects. […] This aspect of technology has increasing importance, at times overshadowing such standbys as speed and price. If you are in doubt about what technology to purchase, get the stuff that will connect the most widely, the most often, and in the most ways. Avoid anything that resembles an island, no matter how well endowed that island is." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998) 

"Any network has two ingredients: nodes and connections. In the grand network we are now assembling, the size of the nodes is collapsing while the quantity and quality of the connections are exploding." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"At present, there is far more to be gained by pushing the boundaries of what can be done by the bottom than by focusing on what can be done at the top. When it comes to control, there is plenty of room at the bottom. What we are discovering is that peer-based networks with millions of parts, minimal oversight, and maximum connection among them can do far more than anyone ever expected. We don’t yet know what the limits of decentralization are." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Don’t solve problems; pursue opportunities. […] In both the short and long term, our ability to solve social and economic problems will be limited primarily to our lack of imagination in seizing opportunities, rather than trying to optimize solutions. There is more to be gained by producing more opportunities than by optimizing existing ones." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998) 

"Mathematics says the sum value of a network increases as the square of the number of members. In other words, as the number of nodes in a network increases arithmetically, the value of the network increases exponentially. Adding a few more members can dramatically increase the value for all members." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Networks have existed in every economy. What’s different now is that networks, enhanced and multiplied by technology, penetrate our lives so deeply that 'network' has become the central metaphor around which our thinking and our economy are organized. Unless we can understand the distinctive logic of networks, we can’t profit from the economic transformation now under way." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Technology is no panacea. It will never solve the ills or injustices of society. Technology can do only one thing for us - but it is an astonishing thing: Technology brings us an increase in opportunities." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"The internet model has many lessons for the new economy but perhaps the most important is its embrace of dumb swarm power. The aim of swarm power is superior performance in a turbulent environment. When things happen fast and furious, they tend to route around central control. By interlinking many simple parts into a loose confederation, control devolves from the center to the lowest or outermost points, which collectively keep things on course. A successful system, though, requires more than simply relinquishing control completely to the networked mob." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"The distinguishing characteristic of networks is that they contain no clear center and no clear outside boundaries. Within a network everything is potentially equidistant from everything else." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

05 June 2006

✒️John D Sterman - Collected Quotes

"Bounded rationality simultaneously constrains the complexity of our cognitive maps and our ability to use them to anticipate the system dynamics. Mental models in which the world is seen as a sequence of events and in which feedback, nonlinearity, time delays, and multiple consequences are lacking lead to poor performance when these elements of dynamic complexity are present. Dysfunction in complex systems can arise from the misperception of the feedback structure of the environment. But rich mental models that capture these sources of complexity cannot be used reliably to understand the dynamics. Dysfunction in complex systems can arise from faulty mental simulation-the misperception of feedback dynamics. These two different bounds on rationality must both be overcome for effective learning to occur. Perfect mental models without a simulation capability yield little insight; a calculus for reliable inferences about dynamics yields systematically erroneous results when applied to simplistic models." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Faced with the overwhelming complexity of the real world, time pressure, and limited cognitive capabilities, we are forced to fall back on rote procedures, habits, rules of thumb, and simple mental models to make decisions. Though we sometimes strive to make the best decisions we can, bounded rationality means we often systematically fall short, limiting our ability to learn from experience." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Just as dynamics arise from feedback, so too all learning depends on feedback. We make decisions that alter the real world; we gather information feedback about the real world, and using the new information we revise our understanding of the world and the decisions we make to bring our perception of the state of the system closer to our goals." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"[...] information feedback about the real world not only alters our decisions within the context of existing frames and decision rules but also feeds back to alter our mental models. As our mental models change we change the structure of our systems, creating different decision rules and new strategies. The same information, processed and interpreted by a different decision rule, now yields a different decision. Altering the structure of our systems then alters their patterns of behavior. The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view and then redesign our policies and institutions accordingly." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"Much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. […] the most complex behaviors usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The robustness of the misperceptions of feedback and the poor performance they cause are due to two basic and related deficiencies in our mental model. First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. Both are direct consequences of bounded rationality, that is, the many limitations of attention, memory, recall, information processing capability, and time that constrain human decision making." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"To avoid policy resistance and find high leverage policies requires us to expand the boundaries of our mental models so that we become aware of and understand the implications of the feedbacks created by the decisions we make. That is, we must learn about the structure and dynamics of the increasingly complex systems in which we are embedded." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000) 

"Deep change in mental models, or double-loop learning, arises when evidence not only alters our decisions within the context of existing frames, but also feeds back to alter our mental models. As our mental models change, we change the structure of our systems, creating different decision rules and new strategies. The same information, interpreted by a different model, now yields a different decision. Systems thinking is an iterative learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view, reinventing our policies and institutions accordingly." (John D Sterman, "Learning in and about complex systems", Systems Thinking Vol. 3 2003)

03 June 2006

✒️John H Holland - Collected Quotes

"A useful general definition of mental models must capture several features inherent in our informal descriptions. First, a model must make it possible for the system to generate predictions even though knowledge of the environment is incomplete. Second, it must be easy to refine the model as additional information is acquired without losing useful information already incorporated. Finally, the model must not make requirements on the cognitive system's processing capabilities that are infeasible computationally. In order to be parsi- monious, it must make extensive use of categorization, dividing the environment up into equivalence classes." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Although mental models are based in part on static prior knowl- edge, they are themselves transient, dynamic representations of par- ticular unique situations. They exist only implicitly, corresponding to the organized, multifaceted description of the current situation and the expectations that flow from it. Despite their inherently transitory nature - indeed because of it - mental models are the major source of inductive change in long-term knowledge structures." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Classifier systems are a kind of rule-based system with general mechanisms for processing rules in parallel, for adaptive generation of new rules, and for testing the effectiveness of existing rules. These mechanisms make possible performance and learning without the "brittleness" characteristic of most expert systems in AI." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Deduction is typically distinguished from induction by the fact that only for the former is the truth of an inference guaranteed by the truth of the premises on which it is based. The fact that an inference is a valid deduction, however, is no guarantee that it is of the slightest interest." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"How can a cognitive system process environmental input and stored knowledge so as to benefit from experience? More specific versions of this question include the following: How can a system organize its experience so that it has some basis for action even in unfamiliar situations? How can a system determine that rules in its knowledge base are inadequate? How can it generate plausible new rules to replace the inadequate ones? How can it refine rules that are useful but non-optimal? How can it use metaphor and analogy to transfer information and procedures from one domain to another?" (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Our approach assumes that the central problem of induction is to specify processing constraints that will ensure that the inferences drawn by a cognitive system will tend to be plausible and relevant to the system's goals. Which inductions should be characterized as plausible can be determined only with reference to the current knowledge of the system. Induction is thus highly context dependent, being guided by prior knowledge activated in particular situations that confront the system as it seeks to achieve its goals. The study of induction, then, is the study of how knowledge is modified through its use." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"We will treat problem solving as a process of search through a state space. A problem is defined by an initial state, one or more goal states to be reached, a set of operators that can transform one state into another, and constraints that an acceptable solution must meet. Problem-solving methods are procedures for selecting an appropriate sequence of operators that will succeed in transforming the initial state into a goal state through a series of steps." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"An internal model allows a system to look ahead to the future consequences of current actions, without actually committing itself to those actions. In particular, the system can avoid acts that would set it irretrievably down some road to future disaster ('stepping off a cliff'). Less dramatically, but equally important, the model enables the agent to make current 'stage-setting' moves that set up later moves that are obviously advantageous. The very essence of a competitive advantage, whether it be in chess or economics, is the discovery and execution of stage-setting moves." (John H Holland, 1992)

"Because the individual parts of a complex adaptive system are continually revising their ('conditioned') rules for interaction, each part is embedded in perpetually novel surroundings (the changing behavior of the other parts). As a result, the aggregate behavior of the system is usually far from optimal, if indeed optimality can even be defined for the system as a whole. For this reason, standard theories in physics, economics, and elsewhere, are of little help because they concentrate on optimal end-points, whereas complex adaptive systems 'never get there'. They continue to evolve, and they steadily exhibit new forms of emergent behavior." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992)

"The systems' basic components are treated as sets of rules. The systems rely on three key mechanisms: parallelism, competition, and recombination. Parallelism permits the system to use individual rules as building blocks, activating sets of rules to describe and act upon the changing situations. Competition allows the system to marshal its rules as the situation demands, providing flexibility and transfer of experience. This is vital in realistic environments, where the agent receives a torrent of information, most of it irrelevant to current decisions. The procedures for adaptation - credit assignment and rule discovery - extract useful, repeatable events from this torrent, incorporating them as new building blocks. Recombination plays a key role in the discovery process, generating plausible new rules from parts of tested rules. It implements the heuristic that building blocks useful in the past will prove useful in new, similar contexts." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992) 

"Even though these complex systems differ in detail, the question of coherence under change is the central enigma for each." (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"If we are to understand the interactions of a large number of agents, we must first be able to describe the capabilities of individual agents." (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"Model building is the art of selecting those aspects of a process that are relevant to the question being asked. As with any art, this selection is guided by taste, elegance, and metaphor; it is a matter of induction, rather than deduction. High science depends on this art." (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"[…] nonlinear interactions almost always make the behavior of the aggregate more complicated than would be predicted by summing or averaging."  (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"The multiplier effect is a major feature of networks and flows. It arises regardless of the particular nature of the resource, be it goods, money, or messages." (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"With theory, we can separate fundamental characteristics from fascinating idiosyncrasies and incidental features. Theory supplies landmarks and guideposts, and we begin to know what to observe and where to act."(John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995) 

"It may not be obvious at first, but the study of emergence and model-building go hand in hand. The essence of model-building is shearing away detail to get at essential elements. A model, by concentrating on selected aspects of the world, makes possible the prediction and planning that reveal new possibilities. That is exactly the problem we face in trying to develop a scientific understanding of emergence." (John H Holland, "Emergence", Philosophica 59, 1997)

"Shearing away detail is the very essence of model building. Whatever else we require, a model must be simpler than the thing modeled. In certain kinds of fiction, a model that is identical with the thing modeled provides an interesting device, but it never happens in reality. Even with virtual reality, which may come close to this literary identity one day, the underlying model obeys laws which have a compact description in the computer - a description that generates the details of the artificial world." (John H Holland, "Emergence", Philosophica 59, 1997)

"Strategy in complex systems must resemble strategy in board games. You develop a small and useful tree of options that is continuously revised based on the arrangement of pieces and the actions of your opponent. It is critical to keep the number of options open. It is important to develop a theory of what kinds of options you want to have open." (John H Holland, [presentation] 2000)

✒️W Ross Ashby - Collected Quotes

"Every stable system has the property that if displaced from a state of equilibrium and released, the subsequent movement is so matched to the initial displacement that the system is brought back to the state of equilibrium. A variety of disturbances will therefore evoke a variety of matched reactions." (W Ross Ashby, "Design for a Brain: The Origin of Adaptive Behavior", 1952)

"The primary fact is that all isolated state-determined dynamic systems are selective: from whatever state they have initially, they go towards states of equilibrium. These states of equilibrium are always characterised, in their relation to the change-inducing laws of the system, by being exceptionally resistant." (W Ross Ashby, "Design for a Brain: The Origin of Adaptive Behavior", 1952)

"A common and very powerful constraint is that of continuity. It is a constraint because whereas the function that changes arbitrarily can undergo any change, the continuous function can change, at each step, only to a neighbouring value." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"A most important concept […] is that of constraint. It is a relation between two sets, and occurs when the variety that exists under one condition is less than the variety that exists under another. [...] Constraints are of high importance in cybernetics […] because when a constraint exists advantage can usually be taken of it." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"[…] as every law of nature implies the existence of an invariant, it follows that every law of nature is a constraint. […] Science looks for laws; it is therefore much concerned with looking for constraints. […] the world around us is extremely rich in constraints. We are so familiar with them that we take most of them for granted, and are often not even aware that they exist. […] A world without constraints would be totally chaotic." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"As shorthand, when the phenomena are suitably simple, words such as equilibrium and stability are of great value and convenience. Nevertheless, it should be always borne in mind that they are mere shorthand, and that the phenomena will not always have the simplicity that these words presuppose." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Cybernetics is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society. And it can provide the common language by which discoveries in one branch can readily be made use of in the others. [...] [There are] two peculiar scientific virtues of cybernetics that are worth explicit mention. One is that it offers a single vocabulary and a single set of concepts suitable for representing the most diverse types of system. [...] The second peculiar virtue of cybernetics is that it offers a method for the scientific treatment of the system in which complexity is outstanding and too important to be ignored. Such systems are, as we well know, only too common in the biological world!" (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"[…] information theory is characterised essentially by its dealing always with a set of possibilities; both its primary data and its final statements are almost always about the set as such, and not about some individual element in the set." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Stability is commonly thought of as desirable, for its presence enables the system to combine of flexibility and activity in performance with something of permanence. Behaviour that is goal-seeking is an example of behaviour that is stable around a state of equilibrium. Nevertheless, stability is not always good, for a system may persist in returning to some state that, for other reasons, is considered undesirable." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"[...] the concept of 'feedback', so simple and natural in certain elementary cases, becomes artificial and of little use when the interconnexions between the parts become more complex. When there are only two parts joined so that each affects the other, the properties of the feedback give important and useful information about the properties of the whole. But when the parts rise to even as few as four, if every one affects the other three, then twenty circuits can be traced through them; and knowing the properties of all the twenty circuits does not give complete information about the system. Such complex systems cannot be treated as an interlaced set of more or less independent feedback circuits, but only as a whole. For understanding the general principles of dynamic systems, therefore, the concept of feedback is inadequate in itself. What is important is that complex systems, richly cross-connected internally, have complex behaviours, and that these behaviours can be goal-seeking in complex patterns." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"There comes a stage, however, as the system becomes larger and larger, when the reception of all the information is impossible by reason of its sheer bulk. Either the recording channels cannot carry all the information, or the observer, presented with it all, is overwhelmed. When this occurs, what is he to do? The answer is clear: he must give up any ambition to know the whole system. His aim must be to achieve a partial knowledge that, though partial over the whole, is none the less complete within itself, and is sufficient for his ultimate practical purpose." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"This 'statistical' method of specifying a system - by specification of distributions with sampling methods - should not be thought of as essentially different from other methods. It includes the case of the system that is exactly specified, for the exact specification is simply one in which each distribution has shrunk till its scatter is zero, and in which, therefore, 'sampling' leads to one inevitable result. What is new about the statistical system is that the specification allows a number of machines, not identical, to qualify for inclusion. The statistical 'machine' should therefore be thought of as a set of machines rather than as one machine." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Every isolated determinate dynamic system, obeying unchanging laws, will ultimately develop some sort of organisms that are adapted to their environments." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […]" (W Ross Ashby, "Principles of the self-organizing system", 1962)

02 June 2006

✒️Lawrence K Samuels - Collected Quotes

"Complexity has the propensity to overload systems, making the relevance of a particular piece of information not statistically significant. And when an array of mind-numbing factors is added into the equation, theory and models rarely conform to reality." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"Complexity scientists concluded that there are just too many factors - both concordant and contrarian - to understand. And with so many potential gaps in information, almost nobody can see the whole picture. Complex systems have severe limits, not only to predictability but also to measurability. Some complexity theorists argue that modelling, while useful for thinking and for studying the complexities of the world, is a particularly poor tool for predicting what will happen." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"If an emerging system is born complex, there is neither leeway to abandon it when it fails, nor the means to join another, successful one. Such a system would be caught in an immovable grip, congested at the top, and prevented, by a set of confusing but locked–in precepts, from changing." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013) 

"Simplicity in a system tends to increase that system’s efficiency. Because less can go wrong with fewer parts, less will. Complexity in a system tends to increase that system’s inefficiency; the greater the number of variables, the greater the probability of those variables clashing, and in turn, the greater the potential for conflict and disarray. Because more can go wrong, more will. That is why centralized systems are inclined to break down quickly and become enmeshed in greater unintended consequences." (Lawrence K Samuels,"Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"Under complexity science, the more interacting factors, the more unpredictable and irregular the outcome. To be succinct, the greater the complexity, the greater the unpredictability." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos", 2013)

"The problem of complexity is at the heart of mankind’s inability to predict future events with any accuracy. Complexity science has demonstrated that the more factors found within a complex system, the more chances of unpredictable behavior. And without predictability, any meaningful control is nearly impossible. Obviously, this means that you cannot control what you cannot predict. The ability ever to predict long-term events is a pipedream. Mankind has little to do with changing climate; complexity does." (Lawrence K Samuels, "The Real Science Behind Changing Climate", LewRockwell.com, August 1, 2014) 
