"A complex system can fail in an infinite number of ways."
"A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system." (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)
"A system represents someone's solution to a problem. The system doesn't solve the problem."
"Systems Are Seductive. They promise to do a hard job faster, better, and more easily than you could do it by yourself. But if you set up a system, you are likely to find your time and effort now being consumed in the care and feeding of the system itself. New problems are created by its very presence. Once set up, it won't go away, it grows and encroaches. It begins to do strange and wonderful things. Breaks down in ways you never thought possible. It kicks back, gets in the way, and opposes its own proper function. Your own perspective becomes distorted by being in the system. You become anxious and push on it to make it work. Eventually you come to believe that the misbegotten product it so grudgingly delivers is what you really wanted all the time. At that point encroachment has become complete. You have become absorbed. You are now a systems person."
"The failure of individual subsystems to be sufficiently adaptive to changing environments results in the subsystems forming a collective association that, as a unit, is better able to function in new circumstances. Formation of such an association is a structural change; the behavioral role of the new conglomerate is a junctional change; both types of change are characteristic of the formation of hierarchies."
"The system always kicks back. - Systems get in the way - or, in slightly more elegant language: Systems tend to oppose their own proper functions. Systems tend to malfunction conspicuously just after their greatest triumph." (John Gall, "Systemantics: The underground text of systems lore", 1986)
"Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)
"Most systems displaying a high degree of tolerance against failures are a common feature: Their functionality is guaranteed by a highly interconnected complex network. A cell's robustness is hidden in its intricate regulatory and metabolic network; society's resilience is rooted in the interwoven social web; the economy's stability is maintained by a delicate network of financial and regulator organizations; an ecosystem's survivability is encoded in a carefully crafted web of species interactions. It seems that nature strives to achieve robustness through interconnectivity. Such universal choice of a network architecture is perhaps more than mere coincidences." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)
"A fundamental reason for the difficulties with modern engineering projects is their inherent complexity. The systems that these projects are working with or building have many interdependent parts, so that changes in one part often have effects on other parts of the system. These indirect effects are frequently unanticipated, as are collective behaviors that arise from the mutual interactions of multiple components. Both indirect and collective effects readily cause intolerable failures of the system. Moreover, when the task of the system is intrinsically complex, anticipating the many possible demands that can be placed upon the system, and designing a system that can respond in all of the necessary ways, is not feasible. This problem appears in the form of inadequate specifications, but the fundamental issue is whether it is even possible to generate adequate specifications for a complex system." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)
"It is no longer sufficient for engineers merely to design boxes such as computers with the expectation that they would become components of larger, more complex systems. That is wasteful because frequently the box component is a bad fit in the system and has to be redesigned or worse, can lead to system failure. We must learn how to design large-scale, complex systems from the top down so that the specification for each component is derivable from the requirements for the overall system. We must also take a much larger view of systems. We must design the man-machine interfaces and even the system-society interfaces. Systems engineers must be trained for the design of large-scale, complex, man-machine-social systems." (A Wayne Wymore, "Systems Movement: Autobiographical Retrospectives", 2004)
"[…] in cybernetics, control is seen not as a function of one agent over something else, but as residing within circular causal networks, maintaining stabilities in a system. Circularities have no beginning, no end and no asymmetries. The control metaphor of communication, by contrast, punctuates this circularity unevenly. It privileges the conceptions and actions of a designated controller by distinguishing between messages sent in order to cause desired effects and feedback that informs the controller of successes or failures." (Klaus Krippendorff, "On Communicating: Otherness, Meaning, and Information", 2009)
"Experts in the 'Problem' area proceed to elaborate its complexity. They design complex Systems to attack it. This approach guarantees failure, at least for all but the most pedestrian tasks. The problem is a Problem precisely because it is incorrectly conceptualized in the first place, and a large System for studying and attacking the Problem merely locks in the erroneous conceptualization into the minds of everyone concerned. What is required is not a large System, but a different approach. Trying to design a System in the hope that the System will somehow solve the Problem, rather than simply solving the Problem in the first place, is to present oneself with two problems in place of one." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)
"Pragmatically, it is generally easier to aim at changing one or a few things at a time and then work out the unexpected effects, than to go to the opposite extreme. Attempting to correct everything in one grand design is appropriately designated as Grandiosity. […] A little Grandiosity goes a long way. […] The diagnosis of Grandiosity is quite elegantly and strictly made on a purely quantitative basis: How many features of the present System, and at what level, are to be corrected at once? If more than three, the plan is grandiose and will fail." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)
"Complex systems seem to have this property, with large periods of apparent stasis marked by sudden and catastrophic failures. These processes may not literally be random, but they are so irreducibly complex (right down to the last grain of sand) that it just won’t be possible to predict them beyond a certain level. […] And yet complex processes produce order and beauty when you zoom out and look at them from enough distance."
"If an emerging system is born complex, there is neither leeway to abandon it when it fails, nor the means to join another, successful one. Such a system would be caught in an immovable grip, congested at the top, and prevented, by a set of confusing but locked–in precepts, from changing." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)
"Stability is often defined as a resilient system that keeps processing transactions, even if transient impulses (rapid shocks to the system), persistent stresses (force applied to the system over an extended period), or component failures disrupt normal processing." (Michael Hüttermann et al, "DevOps for Developers", 2013)
"Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system." (Albert-László Barabási, "Network Science", 2016)