30 June 2006

John L Casti - Collected Quotes

"[…] a complex system is incomprehensible unless we can simplify it by using alternative levels of description." (John L Casti, "On System Complexity: Identification, Measurement, and Management" [in "Complexity, Language, and Life: Mathematical Approaches"] 1986)

"Coping with complexity involves the creation of faithful models of not only the system to be managed. but also of the management system itself." (John L Casti, "On System Complexity: Identification, Measurement, and Management" [in "Complexity, Language, and Life: Mathematical Approaches"] 1986)

"[…] complexity emerges from simplicity when alternative descriptions of a system are not reducible to each other. For a given observer, the more such inequivalent descriptions he or she generates, the more complex the system appears. Conversely, a complex system can be simplified in one of two ways: reduce the number of potential descriptions (by restricting the observer's means of interaction with the system) and/or use a coarser notion of system equivalence, thus reducing the number of equivalence classes." (John L Casti, "On System Complexity: Identification, Measurement, and Management" [in "Complexity, Language, and Life: Mathematical Approaches"] 1986)

"Simple systems generally involve a small number of components. with self-interaction dominating the mutual interaction of the variables. […] Besides involving only a few variables. simple systems generally have very few feedback/feedforward loops. Such loops enable the system to restructure. or at least modify. the interaction pattern of its variables. thereby opening-up the possibility of a wider range of potential behavior patterns." (John L Casti, "On System Complexity: Identification, Measurement, and Management" [in "Complexity, Language, and Life: Mathematical Approaches"] 1986)

"[…] a model is a mathematical representation of the modeler's reality, a way of capturing some aspects of a particular reality within the framework of a mathematical apparatus that provides us with a means for exploring the properties of the reality mirrored in the model." (John L Casti, "Reality Rules: Picturing the world in mathematics", 1992)

"Basically, the point of making models is to be able to bring a measure of order to our experiences and observations, as well as to make specific predictions about certain aspects of the world we experience." (John L Casti, "Reality Rules: Picturing the world in mathematics", 1992)

"Reliable information processing requires the existence of a good code or language, i.e., a set of rules that generate information at a given hierarchical level, and then compress it for use at a higher cognitive level. To accomplish this, a language should strike an optimum balance between variety (stochasticity) and the ability to detect and correct errors (memory)."(John L Casti, "Reality Rules: Picturing the world in mathematics", 1992)

"Skewness is a measure of symmetry. For example, it's zero for the bell-shaped normal curve, which is perfectly symmetric about its mean. Kurtosis is a measure of the peakedness, or fat-tailedness, of a distribution. Thus, it measures the likelihood of extreme values." (John L Casti, "Reality Rules: Picturing the world in mathematics", 1992)

"[…] the complexity of a given system is always determined relative to another system with which the given system interacts. Only in extremely special cases, where one of these reciprocal interactions is so much weaker than the other that it can be ignored, can we justify the traditional attitude regarding complexity as an intrinsic property of the system itself." (John L Casti, "Reality Rules: Picturing the world in mathematics", 1992)

"The core of a decision problem is always to find a single method that can be applied to each question, and that will always give the correct answer for each individual problem." (John L Casti, "Mathematical Mountaintops: The Five Most Famous Problems of All Time", 2001)

"[…] according to the bell-shaped curve the likelihood of a very-large-deviation event (a major outlier) located in the striped region appears to be very unlikely, essentially zero. The same event, though, is several thousand times more likely if it comes from a set of events obeying a fat-tailed distribution instead of the bell-shaped one." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"[…] both rarity and impact have to go into any meaningful characterization of how black any particular [black] swan happens to be." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"Due to the problem of predicting outlier events, they are not usually factored into the design of systems." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"Forecasting models […] ordinarily are based only on past data, which is generally a tiny sample of the total range of possible outcomes. The problem is that those 'experts' who develop the models often come to believe they have mapped the entire space of possible system behaviors, which could not be further from the truth. Worse yet, when outliers do crop up, they are often discounted as 'once in a century' events and are all but ignored in planning for the future. […] the world is much more unpredictable than we’d like to believe."(John L Casti, "X-Events: The Collapse of Everything", 2012)

"If you want a system - economic, social, political, or otherwise - to operate at a high level of efficiency, then you have to optimize its operation in such a way that its resilience is dramatically reduced to unknown - and possibly unknowable - shocks and/or changes in its operating environment. In other words, there is an inescapable price to be paid in efficiency in order to gain the benefits of adaptability and survivability in a highly uncertain environment. There is no escape clause!" (John L Casti, "X-Events: The Collapse of Everything", 2012)

"Sustainability is a delicate balancing act calling upon us to remain on the narrow path between organization and chaos, simplicity and complexity." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"The first path of increasing complexity via innovation often faces limits as to how much complexity can be added or reduced in a given system. This is because if you change the complexity level in one place, a compensating change in the opposite direction generally occurs somewhere else." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"[…] the law of requisite complexity […] states that in order to fully regulate/control a system, the complexity of the controller has to be at least as great as the complexity of the system that’s being controlled. To put it in even simpler terms, only complexity can destroy complexity." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"What is and isn’t complex depends to a large degree not only on a target system, but also on the system(s) the target interacts with, together with the overall context in which the interacting systems are embedded." (John L Casti, "X-Events: The Collapse of Everything", 2012)

Jay W Forrester - Collected Quotes

"[System dynamics] is an approach that should help in important top-management problems [...] The solutions to small problems yield small rewards. Very often the most important problems are but little more difficult to handle than the unimportant. Many [people] predetermine mediocre results by setting initial goals too low. The attitude must be one of enterprise design. The expectation should be for major improvement [...] The attitude that the goal is to explain behavior; which is fairly common in academic circles, is not sufficient. The goal should be to find management policies and organizational structures that lead to greater success." (Jay W Forrester, "Industrial Dynamics", 1961)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban dynamics", 1969)

"Like all systems, the complex system is an interlocking structure of feedback loops [...] This loop structure surrounds all decisions public or private, conscious or unconscious. The processes of man and nature, of psychology and physics, of medicine and engineering all fall within this structure [...]" (Jay W Forrester, "Urban Dynamics", 1969)

"The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by non-linear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops." (Jay F Forrester, "Urban Dynamics", 1969)

"To model the dynamic behavior of a system, four hierarchies of structure should be recognized: closed boundary around the system; feedback loops as the basic structural elements within the boundary; level variables representing accumulations within the feedback loops; rate variables representing activity within the feedback loops." (Jay W Forrester, "Urban Dynamics", 1969)

"First, social systems are inherently insensitive to most policy changes that people choose in an effort to alter the behavior of systems. In fact, social systems draw attention to the very points at which an attempt to intervene will fail. Human intuition develops from exposure to simple systems. In simple systems, the cause of a trouble is close in both time and space to symptoms of the trouble. If one touches a hot stove, the burn occurs here and now; the cause is obvious. However, in complex dynamic systems, causes are often far removed in both time and space from the symptoms. True causes may lie far back in time and arise from an entirely different part of the system from when and where the symptoms occur. However, the complex system can mislead in devious ways by presenting an apparent cause that meets the expectations derived from simple systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"Second, social systems seem to have a few sensitive influence points through which behavior can be changed. These high-influence points are not where most people expect. Furthermore, when a high-influence policy is identified, the chances are great that a person guided by intuition and judgment will alter the system in the wrong direction." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"System dynamics models are not derived statistically from time-series data. Instead, they are statements about system structure and the policies that guide decisions. Models contain the assumptions being made about a system. A model is only as good as the expertise which lies behind its formulation. A good computer model is distinguished from a poor one by the degree to which it captures the essence of a system that it represents. Many other kinds of mathematical models are limited because they will not accept the multiple-feedback-loop and nonlinear nature of real systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"Third, social systems exhibit a conflict between short-term and long-term consequences of a policy change. A policy that produces improvement in the short run is usually one that degrades a system in the long run. Likewise, policies that produce long-run improvement may initially depress behavior of a system. This is especially treacherous. The short run is more visible and more compelling. Short-run pressures speak loudly for immediate attention. However, sequences of actions all aimed at short-run improvement can eventually burden a system with long-run depressants so severe that even heroic short-run measures no longer suffice. Many problems being faced today are the cumulative result of short-run measures taken in prior decades." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"No plea about inadequacy of our understanding of the decision-making processes can excuse us from estimating decision making criteria. To omit a decision point is to deny its presence - a mistake of far greater magnitude than any errors in our best estimate of the process." (Jay W Forrester, "Perspectives on the modelling process", 2000)

28 June 2006

John Gall - Collected Quotes

"A complex system can fail in an infinite number of ways." (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)

"A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system." (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)

"A system represents someone's solution to a problem. The system doesn't solve the problem." (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)

"Loose systems last longer and function better." (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)

"Systems Are Seductive. They promise to do a hard job faster, better, and more easily than you could do it by yourself. But if you set up a system, you are likely to find your time and effort now being consumed in the care and feeding of the system itself. New problems are created by its very presence. Once set up, it won't go away, it grows and encroaches. It begins to do strange and wonderful things. Breaks down in ways you never thought possible. It kicks back, gets in the way, and opposes its own proper function. Your own perspective becomes distorted by being in the system. You become anxious and push on it to make it work. Eventually you come to believe that the misbegotten product it so grudgingly delivers is what you really wanted all the time. At that point encroachment has become complete. You have become absorbed. You are now a systems person." (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)

"The following four propositions, which appear to the author to be incapable of formal proof, are presented as Fundamental Postulates upon which the entire superstructure of General Systemantics [...] is based [...] (1) Everything is a system. (2) Everything is part of a larger system. (3) The universe is infinitely systematizable, both upward (larger systems) and downward (smaller systems) (4) All systems are infinitely complex. (The illusion of simplicity comes from focusing attention on one or a few variables.)" (John Gall, "General Systemantics: How systems work, and especially how they fail", 1975)

"The system always kicks back. - Systems get in the way - or, in slightly more elegant language: Systems tend to oppose their own proper functions. Systems tend to malfunction conspicuously just after their greatest triumph." (John Gall, "Systemantics: The underground text of systems lore", 1986)

"Alternating positive and negative feedback produces a special form of stability represented by endless oscillation between two polar states or conditions." (John Gall, "Systemantics: The Systems Bible", 2002)

"[...] the System may be so thoroughly organized around the familiar response strategy that a new response would require extensive restructuring - something that Systems do with the greatest reluctance and difficulty." (John Gall, "Systemantics: The Systems Bible", 2002)

"The function performed by a System is not operationally identical to the function of the same name performed by a person. In general, a function performed by a larger System is not operationally identical to the function of the same name as performed by a smaller System." (John Gall, "Systemantics: The Systems Bible", 2002)

"Systems-people everywhere share certain attributes, but each specific System tends to attract people with specific sets of traits. […] Systems attract not only Systems-people who have qualities making for success within the System; they also attract individuals who possess specialized traits adapted to allow them to thrive at the expense of the System; i.e., persons who parasitize the System." (John Gall, "Systemantics: The Systems Bible", 2002)

"We are accustomed to thinking that a System acts like a machine, and that if we only knew its mechanism, we could understand, even predict, its behavior. This is wrong. The correct orientation is:  - and if the machine is large and complex enough, it will act like a large System. We simply have our metaphors backwards." (John Gall, "Systemantics: The Systems Bible", 2002)

"When a system is set up to accomplish some goal, a new entity has come into being - the system itself. No matter what the 'goal' of the system, it immediately begins to exhibit systems-behavior, that is, to act according to the general laws that govern the operation of all systems." (John Gall, "Systemantics: The Systems Bible", 2002)

"Almost by definition, one is rarely privileged to 'control' a disaster. Yet the activity somewhat loosely referred to by this term is a substantial portion of Management, perhaps the most important part. […] It is the business of a good Manager to ensure, by taking timely action in the real world, that scenarios of disaster remain securely in the realm of Fantasy." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"But the history of large systems demonstrates that, once the hurdle of stability has been cleared, a more subtle challenge appears. It is the challenge of remaining stable when the rules change. Machines, like organizations or organisms, that fail to meet this challenge find that their previous stability is no longer of any use. The responses that once were life-saving now just make things worse. What is needed now is the capacity to re-write the procedure manual on short notice, or even (most radical change of all) to change goals." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Clearly, total feedback is Not a Good Thing. Too much feedback can over- whelm the response channels, leading to paralysis and inaction. Even in a system designed to accept massive feedback (such as the human brain), if the system is required to accommodate to all incoming data, equilibrium will never be reached. The point of decision will be delayed indefinitely, and no action will be taken." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"[…] even though a System may function very poorly, it can still tend to Expand to Fill the Known Universe, and Positive Feedback only encourages that tendency." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Experts in the 'Problem' area proceed to elaborate its complexity. They design complex Systems to attack it. This approach guarantees failure, at least for all but the most pedestrian tasks. The problem is a Problem precisely because it is incorrectly conceptualized in the first place, and a large System for studying and attacking the Problem merely locks in the erroneous conceptualization into the minds of everyone concerned. What is required is not a large System, but a different approach. Trying to design a System in the hope that the System will somehow solve the Problem, rather than simply solving the Problem in the first place, is to present oneself with two problems in place of one." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"In point of fact, the System may be so thoroughly organized around the familiar response strategy that a new response would require extensive restructuring - something that Systems do with the greatest reluctance and difficulty." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Information Theory is a mathematical treatment of what is left after the meanings have been removed from a Communication." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Not only Nature, but Systems generally, cannot be wise when feedbacks are unduly delayed. Feedback is likely to cause trouble if it is either too slow or too prompt. It must be adjusted to the response rhythms of the system as well as to the tempo of the actual events - a double restriction." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Nothing is more useless than struggling against a Law of Nature. On the other hand, there are circumstances (highly unusual and narrowly defined, of course) when one’s knowledge of Systems-functions will provide precisely the measure of extra added ability needed to tip the scales of a doubtful operation in one’s favor." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Pragmatically, it is generally easier to aim at changing one or a few things at a time and then work out the unexpected effects, than to go to the opposite extreme. Attempting to correct everything in one grand design is appropriately designated as Grandiosity. […] A little Grandiosity goes a long way. […] The diagnosis of Grandiosity is quite elegantly and strictly made on a purely quantitative basis: How many features of the present System, and at what level, are to be corrected at once? If more than three, the plan is grandiose and will fail." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Striving for Perfection produces a kind of tunnel-vision resembling a hypnotic state. Absorbed in the pursuit of perfecting the System at hand, the striver has no energy or attention left over for considering other, possibly better, ways of doing the whole thing." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"Systems are never dealing with the real world that the rest of us have to live in, but instead with a filtered, distorted, and censored version which is all that can get past the sensory organs of the System itself." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"[…] the System has its effects on the people within it. It isolates them, feeds them a distorted and partial version of the outside world, and gives them the illusion of power and effectiveness."  (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

26 June 2006

Donella H Meadows - Collected Quotes

"Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head - my mental models. None of these is or ever will be the real world. […] Our models usually have a strong congruence with the world. That is why we are such a successful species in the biosphere. Especially complex and sophisticated are the mental models we develop from direct, intimate experience of nature, people, and organizations immediately around us." (Donella H Meadows, "Limits to Growth", 1972)

"However, and conversely, our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions. Most of us, for instance, are surprised by the amount of growth an exponential process can generate. Few of us can intuit how to damp oscillations in a complex system." (Donella H Meadows, "Limits to Growth", 1972)

"Technology can relieve the symptoms of a problem without affecting the underlying causes. Faith in technology as the ultimate solution to all problems can thus divert our attention from the most fundamental problem - the problem of growth in a finite system - and prevent us from taking effective action to solve it." (Donella H Meadows, "The Limits to Growth", 1972)

"Models can easily become so complex that they are impenetrable, unexaminable, and virtually unalterable." (Donella H Meadows, "The unavoidable a priori", 1980)

"The world is a complex, interconnected, finite, ecological–social–psychological–economic system. We treat it as if it were not, as if it were divisible, separable, simple, and infinite. Our persistent, intractable global problems arise directly from this mismatch." (Donella H Meadows, "Whole Earth Models and System", 1982)

"A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time." (Donella Meadows, "Thinking in systems: A Primer", 2008)

"A system is a set of things – people, cells, molecules, or whatever – interconnected in such a way that they produce their own pattern of behavior over time. […] The system, to a large extent, causes its own behavior." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008) 

"In fact, one of the most frustrating aspects of systems is that the purposes of subunits may add up to an overall behavior that no one wants." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)"

"In physical, exponentially growing systems, there must be at least one reinforcing loop driving growth and at least one balancing feedback loop constraining growth, because no system can grow forever in a finite environment." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)

"Like resilience, self-organizazion is often sacrificed for purposes of short-term productivity and stability." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)

"Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can't measure. Think about that for a minute. It means that we make quantity more important than quality." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"[…] our mental models fail to take into account the complications of the real world - at least those ways that one can see from a systems perspective. It is a warning list. Here is where hidden snags lie. You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)

"The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)

"The world is nonlinear. Trying to make it linear for our mathematical or administrative convenience is not usually a good idea even when feasible, and it is rarely feasible." (Donella H Meadow, "Thinking in Systems: A Primer", 2008)

"When there are long delays in feedback loops, some sort of foresight is essential." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)

"You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays." (Donella H Meadow, "Thinking in Systems: A Primer", 2008)

25 June 2006

Paul Cilliers - Collected Quotes

"A neural network consists of large numbers of simple neurons that are richly interconnected. The weights associated with the connections between neurons determine the characteristics of the network. During a training period, the network adjusts the values of the interconnecting weights. The value of any specific weight has no significance; it is the patterns of weight values in the whole system that bear information. Since these patterns are complex, and are generated by the network itself (by means of a general learning strategy applicable to the whole network), there is no abstract procedure available to describe the process used by the network to solve the problem. There are only complex patterns of relationships." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Each element in the system is ignorant of the behavior of the system as a whole, it responds only to information that is available to it locally. This point is vitally important. If each element ‘knew’ what was happening to the system as a whole, all of the complexity would have to be present in that element." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems" , 1998)

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"From a more general philosophical perspective we can say that we wish to model complex systems because we want to understand them better.  The main requirement for our models accordingly shifts from having to be correct to being rich in information.  This does not mean that the relationship between the model and the system itself becomes less important, but the shift from control and prediction to understanding does have an effect on our approach to complexity: the evaluation of our models in terms of performance can be deferred. Once we have a better understanding of the dynamics of complexity, we can start looking for the similarities and differences between different complex systems and thereby develop a clearer understanding of the strengths and limitations of different models." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Meaning is conferred not by a one-to-one correspondence of a symbol with some external concept or object, but by the relationships between the structural components of the system itself." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Modelling techniques on powerful computers allow us to simulate the behaviour of complex systems without having to understand them.  We can do with technology what we cannot do with science.  […] The rise of powerful technology is not an unconditional blessing.  We have  to deal with what we do not understand, and that demands new  ways of thinking." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"The ability of neural networks to operate successfully on inputs that did not form part of the training set is one of their most important characteristics. Networks are capable of finding common elements in all the training examples belonging to the same class, and will then respond appropriately when these elements are encountered again. Optimising this capability is an important consideration when designing a network." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"The concept ‘complexity’ is not univocal either. Firstly, it is useful to distinguish between the notions ‘complex’ and ‘complicated’. If a system- despite the fact that it may consist of a huge number of components - can be given a complete description in terms of its individual constituents, such a system is merely complicated. […] In a complex system, on the other hand, the interaction among constituents of the system, and the interaction between the system and its environment, are of such a nature that the system as a whole cannot be fully understood simply by analysing its components. Moreover, these relationships are not fixed, but shift and change, often as a result of self-organisation. This can result in novel features, usually referred to in terms of emergent properties." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems" , 1998)

"There is no over-arching theory of complexity that allows us to ignore the contingent aspects of complex systems. If something really is complex, it cannot by adequately described by means of a simple theory. Engaging with complexity entails engaging with specific complex systems. Despite this we can, at a very basic level, make general remarks concerning the conditions for complex behaviour and the dynamics of complex systems. Furthermore, I suggest that complex systems can be modelled." (Paul Cilliers," Complexity and Postmodernism", 1998)

22 June 2006

Kenneth E Boulding - Collected Quotes

"Knowledge is not something which exists and grows in the abstract. It is a function of human organisms and of social organization. Knowledge, that is to say, is always what somebody knows: the most perfect transcript of knowledge in writing is not knowledge if nobody knows it. Knowledge however grows by the receipt of meaningful information - that is, by the intake of messages by a knower which are capable of reorganising his knowledge." (Kenneth E Boulding, "General Systems Theory: The Skeleton of Science", Management Science Vol. 2 (3), 1956)

"One advantage of exhibiting a hierarchy of systems in this way is that it gives us some idea of the present gaps in both theoretical and empirical knowledge. Adequate theoretical models extend up to about the fourth level, and not much beyond. Empirical knowledge is deficient at practically all levels." (Kenneth E Boulding, "General Systems Theory: The Skeleton of Science", 1956)

"It is important to realize that the exercise of any skill depends on the ability to create an abstract system of some kind out of the totality of the world around us." (Kenneth E Boulding, "The Skills of the Economist", 1958)

"The idea of knowledge as an improbable structure is still a good place to start. Knowledge, however, has a dimension which goes beyond that of mere information or improbability. This is a dimension of significance which is very hard to reduce to quantitative form. Two knowledge structures might be equally improbable but one might be much more significant than the other." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)

"It [knowledge] is clearly related to information, which we can now measure; and an economist especially is tempted to regard knowledge as a kind of capital structure, corresponding to information as an income flow. Knowledge, that is to say, is some kind of improbable structure or stock made up essentially of patterns - that is, improbable arrangements, and the more improbable the arrangements, we might suppose, the more knowledge there is." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)

"The human condition can almost be summed up in the observation that, whereas all experiences are of the past, all decisions are about the future. It is the great task of human knowledge to bridge this gap and to find those patterns in the past which can be projected into the future as realistic images." (Kenneth E Boulding, [foreword] 1972)

"We never like to admit to ourselves that we have made a mistake. Organizational structures tend to accentuate this source of failure of information." (Kenneth E Boulding, "Toward a General Social Science", 1974)

"Prediction of the future is possible only in systems that have stable parameters like celestial mechanics. The only reason why prediction is so successful in celestial mechanics is that the evolution of the solar system has ground to a halt in what is essentially a dynamic equilibrium with stable parameters. Evolutionary systems, however, by their very nature have unstable parameters. They are disequilibrium systems and in such systems our power of prediction, though not zero, is very limited because of the unpredictability of the parameters themselves. If, of course, it were possible to predict the change in the parameters, then there would be other parameters which were unchanged, but the search for ultimately stable parameters in evolutionary systems is futile, for they probably do not exist… Social systems have Heisenberg principles all over the place, for we cannot predict the future without changing it." (Kenneth E Boulding, "Evolutionary Economics", 1981)

19 June 2006

Stephen G Haines - Collected Quotes

"Delay time, the time between causes and their impacts, can highly influence systems. Yet the concept of delayed effect is often missed in our impatient society, and when it is recognized, it’s almost always underestimated. Such oversight and devaluation can lead to poor decision making as well as poor problem solving, for decisions often have consequences that don’t show up until years later. Fortunately, mind mapping, fishbone diagrams, and creativity/brainstorming tools can be quite useful here." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Our simplistic cause-effect analyses, especially when coupled with the desire for quick fixes, usually lead to far more problems than they solve - impatience and knee-jerk reactions included. If we stop for a moment and take a good look our world and its seven levels of complex and interdependent systems, we begin to understand that multiple causes with multiple effects are the true reality, as are circles of causality-effects." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Strategic planning and strategic change management are really 'strategic thinking'. It’s about clarity and simplicity, meaning and purpose, and focus and direction." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Systems thinking is based on the theory that a system is, in essence, circular. Using a systems approach in your strategic management, therefore, provides a circular implementing structure that can evolve, with continuously improving, self-checking, and learning capabilities [...]" (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"The systems approach, on the other hand, provides an expanded structural design of organizations as living systems that more accurately reflects reality." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"This is what systems thinking is all about: the idea of building an organization in which each piece, and partial solution of the organization has the fit, alignment, and integrity with your overall organization as a system, and its outcome of serving the customer." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"True systems thinking, on the other hand, studies each problem as it relates to the organization’s objectives and interaction with its entire environment, looking at it as a whole within its universe. Taking your organization from a partial systems to a true systems state requires effective strategic management and backward thinking." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

17 June 2006

Yaneer Bar-Yam - Collected Quotes

"A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components. The amount of information necessary to describe the behavior of such a system is a measure of its complexity." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Many of the systems that surround us are complex. The goal of understanding their properties motivates much if not all of scientific inquiry. […] all scientific endeavor is based, to a greater or lesser degree, on the existence of universality, which manifests itself in diverse ways. In this context, the study of complex systems as a new endeavor strives to increase our ability to understand the universality that arises when systems are highly complex." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"There are two approaches to organizing the properties of complex systems that will serve as the foundation of our discussions. The first of these is the relationship between elements, parts and the whole. Since there is only one property of the complex system that we know for sure - that it is complex - the primary question we can ask about this relationship is how the complexity of the whole is related to the complexity of the parts. […] The second approach to the study of complex systems begins from an understanding of the relationship of systems to their descriptions. The central issue is defining quantitatively what we mean by complexity."  (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"A fundamental reason for the difficulties with modern engineering projects is their inherent complexity. The systems that these projects are working with or building have many interdependent parts, so that changes in one part often have effects on other parts of the system. These indirect effects are frequently unanticipated, as are collective behaviors that arise from the mutual interactions of multiple components. Both indirect and collective effects readily cause intolerable failures of the system. Moreover, when the task of the system is intrinsically complex, anticipating the many possible demands that can be placed upon the system, and designing a system that can respond in all of the necessary ways, is not feasible. This problem appears in the form of inadequate specifications, but the fundamental issue is whether it is even possible to generate adequate specifications for a complex system." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Complex problems are the problems that persist - the problems that bounce back and continue to haunt us. People often go through a series of stages in dealing with such problems - from believing they are beyond hope, to galvanizing collective efforts of many people and dollars to address the problem, to despair, retreat, and rationalization." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Emergence refers to the relationship between the details of a system and the larger view. Emergence does not emphasize the primary importance of the details or of the larger view; it is concerned with the relationship between the two. Specifically, emergence seeks to discover: Which details are important for the larger view, and which are not? How do collective properties arise from the properties of parts? How does behavior at a larger scale of the system arise from the detailed structure, behavior and relationships on a finer scale?" (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Engineers use abstraction to simplify the description or specification of the system, extracting the properties of the system they find most relevant and ignoring other details. While this is a useful tool, it assumes that the details that will be provided to one part of the system (module) can be designed independently of details in other parts."  (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"Modularity, an approach that separates a large system into simpler parts that are individually designed and operated, incorrectly assumes that complex system behavior can essentially be reduced to the sum of its parts. A planned decomposition of a system into modules works well for systems that are not too complex. […] However, as systems become more complex, this approach forces engineers to devote increasing attention to designing the interfaces between parts, eventually causing the process to break down."  (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The basic idea of systems engineering is that it is possible to take a large and highly complex system that one wants to build, separate it into key parts, give the parts to different groups of people to work on, and coordinate their development so that they can be put together at the end of the process. This mechanism is designed to be applied recursively, so that we separate the large system into parts, then the parts into smaller parts, until each part is small enough for one person to execute. Then we put all of the parts together until the entire system works." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The collapse of a particular project may appear to have a specific cause, but an overly high intrinsic complexity of these systems is a problem common to many of them. A chain always breaks first in one particular link, but if the weight it is required to hold is too high, failure of the chain is guaranteed." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The complexity of each individual or organization must match the complexity of the task each is to perform. When we think about a highly complex problem, we are generally thinking about tasks that are more complex than a single individual can understand. Otherwise, complexity is not the main issue in solving it. If a problem is more complex than a single individual, the only way to solve it is to have a group of people - organized appropriately - solve it together. When an organization is highly complex it can only function by making sure that each individual does not have to face the complexity of the task of the organization as a whole. Otherwise failure will occur most of the time." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The complexity of engineering projects has been increasing, but this is not to say that this complexity is new. Engineers and managers are generally aware of the complexity of these projects and have developed systematic techniques that are often useful in addressing it. Notions like modularity, abstraction, hierarchy and layering allow engineers to usefully analyze the complex systems they are working with. At a certain level of interdependence, though, these standard approaches become ineffective." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"The most basic issue for organizational success is correctly matching a system’s complexity to its environment. When we want to accomplish a task, the complexity of the system performing that task must match the complexity of the task. In order to perform the matching correctly, one must recognize that each person has a limited level of complexity. Therefore, tasks become difficult because the complexity of a person is not large enough to handle the complexity of the task. The trick then is to distribute the complexity of the task among many individuals." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"There is a dual nature to engineering. Engineers are responsible for careful quantitative evaluation of how to achieve objectives, what to do to achieve them, and even (a task that most people find almost impossible) how long it will take to do the task. The other side of engineering is an independent creative 'cowboy'-type attitude characteristic of people breaking out of the mold, coming up with novel ideas, implementing them, and changing the world through new technology. This is the culture of high-tech innovation." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"There is no doubt that science has made great progress by taking things apart, but it has become increasingly clear that many important questions can only be addressed by thinking more carefully about relationships between and amongst the parts. Indeed, one of the main difficulties in answering questions or solving problems - any kind of problem - is that we think the problem is in the parts, when it is really in the relationships between them." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"What do people do today when they don’t understand 'the system'? They try to assign responsibility to someone to fix the problem, to oversee 'the system', to coordinate and control what is happening. It is time we recognized that 'the system' is how we work together. When we don’t work together effectively putting someone in charge by its very nature often makes things worse, rather than better, because no one person can understand 'the system' well enough to be responsible. We need to learn how to improve the way we work together, to improve 'the system' without putting someone in charge, in order to make things work." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"What is the solution to coordinating people to perform complex tasks? Analyzing the flows of information and the way tasks are distributed through the system can help. ultimately, however, the best solution is to create an environment where evolution can take place. Organizations that learn by evolutionary change create an environment of ongoing innovation. Evolution by competition and cooperation and the creation of composites of patterns of behavior is the way to synthesize effective systems to meet the complex challenges of today’s world." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004)

"When parts are acting independently, the fine scale behavior is more complex. When they are working together, the fine scale complexity is much smaller, but the behavior is on a larger scale. This means that complexity is always a trade-off, more complex at a large scale means simpler at a fine scale. This trade-off is a basic conceptual tool that we need in order to understand complex systems." (Yaneer Bar-Yam, "Making Things Work: Solving Complex Problems in a Complex World", 2004) 

Fernando J Corbató - Collected Quotes

"Systems with unknown behavioral properties require the implementation of iterations which are intrinsic to the design process but which are normally hidden from view. Certainly when a solution to a well-understood problem is synthesized, weak designs are mentally rejected by a competent designer in a matter of moments. On larger or more complicated efforts, alternative designs must be explicitly and iteratively implemented. The designers perhaps out of vanity, often are at pains to hide the many versions which were abandoned and if absolute failure occurs, of course one hears nothing. Thus the topic of design iteration is rarely discussed. Perhaps we should not be surprised to see this phenomenon with software, for it is a rare author indeed who publicizes the amount of editing or the number of drafts he took to produce a manuscript." (Fernando J Corbató, "A Managerial View of the Multics System Development", 1977)

"Because one has to be an optimist to begin an ambitious project, it is not surprising that underestimation of completion time is the norm." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

"Design bugs are often subtle and occur by evolution with early assumptions being forgotten as new features or uses are added to systems." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

"It is important to emphasize the value of simplicity and elegance, for complexity has a way of compounding difficulties and as we have seen, creating mistakes. My definition of elegance is the achievement of a given functionality with a minimum of mechanism and a maximum of clarity." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

"The value of metaphors should not be underestimated. Metaphors have the virtue of an expected behavior that is understood by all. Unnecessary communication and misunderstandings are reduced. Learning and education are quicker. In effect metaphors are a way of internalizing and abstracting concepts allowing one's thinking to be on a higher plane and low-level mistakes to be avoided." (Fernando J Corbató, "On Building Systems That Will Fail", 1991)

16 June 2006

Karl E Weick - Collected Quotes

"If all of the elements in a large system are loosely coupled to one another, then any one element can adjust to and modify a local a local unique contingency without affecting the whole system. These local adaptations can be swift, relatively economical, and substantial." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"In a loosely coupled system there is more room available for self-determination by the actors. If it is argued that a sense of efficacy is crucial for human beings. when a sense of efficacy might be greater in a loosely coupled system with autonomous units than it would be in a tightly coupled system where discretion is limited." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"Managers construct, rearrange, single out, and demolish many objective features of their surroundings. When people act they unrandomize variables, insert vestiges of orderliness, and literally create their own constraints." (Karl E Weick, "Social Psychology of Organizing", 1979)

"The typical coupling mechanisms of authority of office and logic of the task do not operate in educational organizations." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"Any approach to the study of organizations is built on specific assumptions about the nature of organizations and how they are designed and function." (Richard L Daft & Karl E Weick, "Toward a model of organizations as interpretation systems", Academy of Management Review Vol 9 (2), 1984)

"Action often creates the orderly relations that originally were mere presumptions summarized in a cause map. Thus language trappings of organizations such as strategic plans are important components in the process of creating order. They hold events together long enough and tightly enough in people's heads so that they act in the belief that their actions will be influential and make sense." (Karl E. Weick, "Organizational culture as a source of high reliability", 1987)

"An ordered set of assertions about a generic behavior or structure assumed to hold throughout a significantly broad range of specific instances." (Karl E Weick, "Theory construction as disciplined imagination", 1989)

"Experience is the consequence of activity. The manager literally wades into the swarm of 'events' that surround him and actively tries to unrandomize them and impose some order: The manager acts physically in the environment, attends to some of it, ignores most of it, talks to other people about what they see and are doing."  (Karl E Weick, "Sensemaking in Organizations", 1995)

"Organizations are presumed to talk to themselves over and over to find out what they are thinking." (Karl E Weick, "Sensemaking in Organizations", 1995)

"Sensemaking is about the enlargement of small cues. It is a search for contexts within which small details fit together and make sense. It is people interacting to flesh out hunches. It is a continuous alternation between particulars and explanations with each cycle giving added form and substance to the other." (Karl E Weick, "Sensemaking in Organizations", 1995)

"Sensemaking tends to be swift, which means we are more likely to see products than processes." (Karl E Weick, Sensemaking in Organizations, 1995)

"The organism or group enacts equivocal raw talk, the talk is viewed retrospectively, sense is made of it, and then this sense is stored as knowledge in the retention process. The aim of each process has been to reduce equivocality and to get some idea of what has occurred." (Karl E Weick, "Sensemaking in Organizations", 1995)

"The point we want to make here is that sensemaking is about plausibility, coherence, and reasonableness. Sensemaking is about accounts that are socially acceptable and credible... It would be nice if these accounts were also accurate. But in an equivocal, postmodern world, infused with the politics of interpretation and conflicting interests and inhabited by people with multiple shifting identities, an obsession with accuracy seems fruitless, and not of much practical help, either." (Karl E Weick, "Sensemaking in Organizations", 1995)

"To talk about sensemaking is to talk about reality as an ongoing accomplishment that takes form when people make retrospective sense of the situations in which they find themselves and their creations. There is a strong reflexive quality to this process. People make sense of things by seeing a world on which they already imposed what they believe. In other words, people discover their own inventions. This is why sensemaking can be understood as invention and interpretations understood as discovery. These are complementary ideas. If sensemaking is viewed as an act of invention, then it is also possible to argue that the artifacts it produces include language games and texts." (Karl E Weick, "Sensemaking in Organizations", 1995)

"When people perform an organized action sequence and are interrupted, they try to make sense of it. The longer they search, the higher the arousal, and the stronger the emotion. If the interruption slows the accomplishment of an organized sequence, people are likely to experience anger. If the interruption has accelerated accomplishment, then they are likely to experience pleasure. If people find that the interruption can be circumvented, they experience relief. If they find that the interruption has thwarted a higher level plan, then anger is likely to turn into rage, and if they find that the interruption has thwarted a minor behavioural sequence, they are likely to feel irritated." (Karl E Weick, "Sensemaking in Organizations", 1995)

"The basic idea of sensemaking is that reality is an ongoing accomplishment that emerges from efforts to create order and make retrospective sense of what occurs." (Karl E Weick, "The collapse of sensemaking in organizations: The Mann Gulch disaster", Administrative Science Quarterly 3, 1993)

14 June 2006

Malcolm Gladwell - Collected Quotes

"That is the paradox of the epidemic: that in order to create one contagious movement, you often have to create many small movements first." (Malcolm Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire." (Malcolm Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"Every moment – every blink – is composed of a series of discrete moving parts, and every one of those parts offers an opportunity for intervention, for reform, and for correction." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"Often a sign of expertise is noticing what doesn't happen." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"Taking our powers of rapid cognition seriously means we have to acknowledge the subtle influences that can alter or undermine or bias the products of our unconscious." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"The key to good decision making is not knowledge. It is understanding. We are swimming in the former. We are desperately lacking in the latter." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"The values of the world we inhabit and the people we surround ourselves with have a profound effect on who we are." (Malcolm Gladwell, "Outliers: The Story of Success", 2008)

"Those three things - autonomy, complexity, and a connection between effort and reward - are, most people will agree, the three qualities that work has to have if it is to be satisfying." (Malcolm Gladwell, "Outliers: The Story of Success", 2008)

"Truly successful decision-making relies on a balance between deliberate and instinctive thinking." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)

"We learn by example and by direct experience because there are real limits to the adequacy of verbal instruction." (Malcolm Gladwell, "Blink: The Power of Thinking Without Thinking", 2008)


13 June 2006

Carlos Gershenson - Collected Quotes

"Self-organization can be seen as a spontaneous coordination of the interactions between the components of the system, so as to maximize their synergy. This requires the propagation and processing of information, as different components perceive different aspects of the situation, while their shared goal requires this information to be integrated. The resulting process is characterized by distributed cognition: different components participate in different ways to the overall gathering and processing of information, thus collectively solving the problems posed by any perceived deviation between the present situation and the desired situation." (Carlos Gershenson & Francis Heylighen, "How can we think the complex?", 2004)

[synergy:] "Measure describing how one agent or system increases the satisfaction of other agents or systems." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly.  A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007) 

"To develop a Control, the designer should find aspect systems, subsystems, or constraints that will prevent the negative interferences between elements (friction) and promote positive interferences (synergy). In other words, the designer should search for ways of minimizing frictions that will result in maximization of the global satisfaction" (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Complexity carries with it a lack of predictability different to that of chaotic systems, i.e. sensitivity to initial conditions. In the case of complexity, the lack of predictability is due to relevant interactions and novel information created by them." (Carlos Gershenson, "Understanding Complex Systems", 2011)

"Complexity has shown that reductionism is limited, in the sense that emergent properties cannot be reduced. In other words, the properties at a given scale cannot be always described completely in terms of properties at a lower scale. This has led people to debate on the reality of phenomena at different scales." (Carlos Gershenson, "Complexity", 2011)

09 June 2006

Scott E Page - Collected Quotes

"Effective models require a real world that has enough structure so that some of the details can be ignored. This implies the existence of solid and stable building blocks that encapsulate key parts of the real system’s behavior. Such building blocks provide enough separation from details to allow modeling to proceed." (John H Miller & Scott E Page, "Complex Adaptive Systems: An Introduction to Computational Models of Social Life", 2007)

"Models need to be judged by what they eliminate as much as by what they include - like stone carving, the art is in removing what you do not need." (John H Miller & Scott E Page, "Complex Adaptive Systems: An Introduction to Computational Models of Social Life", 2007)

"A heuristic is a rule applied to an existing solution represented in a perspective that generates a new (and hopefully better) solution or a new set of possible solutions." (Scott E Page, "The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools and Societies", 2008)

"A perspective is a map from reality to an internal language such that each distinct object, situation, problem, or event gets mapped to a unique word." (Scott E Page, "The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools and Societies", 2008)

"[...] diverse, connected, interdependent entities whose behavior is determined by rules, which may adapt, but need not. The interactions of these entities often produce phenomena that are more than the parts. These phenomena are called emergent." (Scott E Page, "Diversity and Complexity", 2010)

"If we can understand how to leverage diversity to achieve better performance and greater robustness, we might anticipate and prevent collapses." (Scott E Page, "Diversity and Complexity", 2010)

"[…] many-model thinking produces wisdom through a diverse ensemble of logical frames. The various models accentuate different causal forces. Their insights and implications overlap and interweave. By engaging many models as frames, we develop nuanced, deep understandings." (Scott E Page," The Model Thinker", 2018)

“Models are formal structures represented in mathematics and diagrams that help us to understand the world. Mastery of models improves your ability to reason, explain, design, communicate, act, predict, and explore.” (Scott E Page, “The Model Thinker”, 2018)

"Collective intelligence is where the whole is smarter than any one individual in it. You can think of it that in a predictive context, this could be the wisdom of crowds, sort of thing where people guessing the weight of a steer, the crowd’s guess is going to be better than the average guess of the person in it." (Scott E Page [interview])

"Diverse groups of problem solvers outperformed the groups of the best individuals at solving complex problems. The reason: the diverse groups got stuck less often than the smart individuals, who tended to think similarly." (Scott E Page)

Fritjof Capra - Collected Quotes

"The new paradigm may be called a holistic world view, seeing the world as an integrated whole rather than a dissociated collection of parts. It may also be called an ecological view, if the term 'ecological' is used in a much broader and deeper sense than usual. Deep ecological awareness recognizes the fundamental interdependence of all phenomena and the fact that, as individuals and societies we are all embedded in (and ultimately dependent on) the cyclical process of nature." (Fritjof Capra & Gunter A Pauli, "Steering Business Toward Sustainability", 1995)

“[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations.” (Fritjof  Capra, “The web of life: a new scientific understanding of living  systems”, 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The more we study the major problems of our time, the more we come to realise that they cannot be understood in isolation. They are systemic problems, which means that they are interconnected and interdependent." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"Understanding ecological interdependence means understanding relationships. It requires the shifts of perception that are characteristic of systems thinking - from the parts to the whole, from objects to relationships, from contents to patterns. [...]  Nourishing the community means nourishing those relationships." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"What is sustained in a sustainable community is not economic growth, development, market share, or competitive advantage, but the entire web of life on which our long-term survival depends. In other words, a sustainable community is designed in such a way that its ways of life, businesses, economy, physical structures, and technologies do not interfere with nature’s inherent ability to sustain life." (Fritjof Capra, "Ecoliteracy: The Challenge for Education in the Next Century", 1999)

"One of the key insights of the systems approach has been the realization that the network is a pattern that is common to all life. Wherever we see life, we see networks." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"This spontaneous emergence of order at critical points of instability, which is often referred to simply as 'emergence', is one of the hallmarks of life. It has been recognized as the dynamic origin of development, learning, and evolution. In other words, creativity - the generation of new forms - is a key property of all living systems." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

08 June 2006

Fred C Schweppe - Collected Quotes

 "A bias can be considered a limiting case of a nonwhite disturbance as a constant is the most time-correlated process possible." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Changes of variables can be helpful for iterative and parametric solutions even if they do not linearize the problem. For example, a change of variables may change the 'shape' of J(x) into a more suitable form. Unfortunately there seems to be no· general way to choose the 'right' change of variables. Success depends on the particular problem and the engineer's insight. However, the possibility of a change of variables should always be considered."(Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Decision-making problems (hypothesis testing) involve situations where it is desired to make a choice among various alternative decisions (hypotheses). Such problems can be viewed as generalized state estimation problems where the definition of state has simply been expanded." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Hypothesis testing can introduce the need for multiple models for the multiple hypotheses and,' if appropriate, a priori probabilities. The one modeling aspect of hypothesis testing that has no estimation counterpart is the problem of specifying the hypotheses to be considered. Often this is a critical step which influences both performance arid the difficulty of implementation." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Modeling is definitely the most important and critical problem. If the mathematical model is not valid, any subsequent analysis, estimation, or control study is meaningless. The development of the model in a convenient form can greatly reduce the complexity of the actual studies." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Pattern recognition can be viewed as a special case of hypothesis testing. In pattern recognition, an observation z is to be used to decide what pattern caused it. Each possible pattern can be viewed as one hypothesis. The main problem in pattern recognition is the development of models for the z corresponding to each pattern (hypothesis)." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"System theory is a tool which engineers use to help them design the 'best' system to do the job that must be done. A dominant characteristic of system theory is the interest in the analysis and design (synthesis) of systems from an input-output point of view. System theory uses mathematical manipulation of a mathematical model to help design the actual system." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The biggest (and sometimes insurmountable) problem is usually to use the available data (information, measurements, etc.) to find out what the system is actually doing (i.e., to estimate its state). If the system's state can be estimated to some reasonable accuracy, the desired control is often obvious (or can be obtained by the use of deterministic control theory)." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The choice of model is often the most critical aspect of a design and development engineering job, but it is impossible to give explicit rules or techniques." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The power and beauty of stochastic approximation theory is that it provides simple, easy to implement gain sequences which guarantee convergence without depending (explicitly) on knowledge of the function to be minimized or the noise properties. Unfortunately, convergence is usually extremely slow. This is to be expected, as "good performance" cannot be expected if no (or very little) knowledge of the nature of the problem is built into the algorithm. In other words, the strength of stochastic approximation (simplicity, little a priori knowledge) is also its weakness." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The pseudo approach to uncertainty modeling refers to the use of an uncertainty model instead of using a deterministic model which is actually (or at least theoretically) available. The uncertainty model may be desired because it results in a simpler analysis, because it is too difficult (expensive) to gather all the data necessary for an exact model, or because the exact model is too complex to be included in the computer." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"[A] system is represented by a mathematical model which may take many forms, such as algebraic equations, finite state machines, difference equations, ordinary differential equations, partial differential equations, and functional equations. The system model may be uncertain, as the mathematical model may not be known completely." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The term hypothesis testing arises because the choice as to which process is observed is based on hypothesized models. Thus hypothesis testing could also be called model testing. Hypothesis testing is sometimes called decision theory. The detection theory of communication theory is a special case." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

07 June 2006

F David Peat - Collected Quotes

"A model is a simplified picture of physical reality; one in which, for example, certain contingencies such as friction, air resistance, and so on have been neglected. This model reproduces within itself some essential feature of the universe. While everyday events in nature are highly contingent and depend upon all sorts of external perturbations and contexts, the idealized model aims to produce the essence of phenomena." (F David Peat, "From Certainty to Uncertainty", 2002)

"A system at a bifurcation point, when pushed slightly, may begin to oscillate. Or the system may flutter around for a time and then revert to its normal, stable behavior. Or, alternatively it may move into chaos. Knowing a system within one range of circumstances may offer no clue as to how it will react in others. Nonlinear systems always hold surprises." (F David Peat, "From Certainty to Uncertainty", 2002)

"A theory makes certain predictions and allows calculations to be made that can be tested directly through experiments and observations. But a theory such as superstrings talks about quantum objects that exist in a multidimensional space and at incredibly short distances. Other grand unified theories would require energies close to those experienced during the creation of the universe to test their predictions." (F David Peat, "From Certainty to Uncertainty", 2002)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos itself is one form of a wide range of behavior that extends from simple regular order to systems of incredible complexity. And just as a smoothly operating machine can become chaotic when pushed too hard (chaos out of order), it also turns out that chaotic systems can give birth to regular, ordered behavior (order out of chaos). […] Chaos and chance don’t mean the absence of law and order, but rather the presence of order so complex that it lies beyond our abilities to grasp and describe it." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos theory explains the ways in which natural and social systems organize themselves into stable entities that have the ability to resist small disturbances and perturbations. It also shows that when you push such a system too far it becomes balanced on a metaphoric knife-edge. Step back and it remains stable; give it the slightest nudge and it will move into a radically new form of behavior such as chaos." (F David Peat, "From Certainty to Uncertainty", 2002)

"Lessons from chaos theory show that energy is always needed for reorganization. And for a new order to appear an organization must be willing to allow a measure of chaos to occur; chaos being that which no one can totally control. It means entering a zone where no one can predict the final outcome or be truly confident as to what will happen." (F David Peat, "From Certainty to Uncertainty", 2002)

"The theories of science are all about idealized models and, in turn, these models give pictures of reality. […] But when we speak of the quantum world we find we are employing concepts that simply do not fit. When we discuss our models of reality we are continually importing ideas that are inappropriate and have no real meaning in the quantum domain." (F David Peat, "From Certainty to Uncertainty", 2002)

"There are endless examples of elaborate structures and apparently complex processes being generated through simple repetitive rules, all of which can be easily simulated on a computer. It is therefore tempting to believe that, because many complex patterns can be generated out of a simple algorithmic rule, all complexity is created in this way." (F David Peat, "From Certainty to Uncertainty", 2002)

"[…] while chaos theory deals in regions of randomness and chance, its equations are entirely deterministic. Plug in the relevant numbers and out comes the answer. In principle at least, dealing with a chaotic system is no different from predicting the fall of an apple or sending a rocket to the moon. In each case deterministic laws govern the system. This is where the chance of chaos differs from the chance that is inherent in quantum theory." (F David Peat, "From Certainty to Uncertainty", 2002)

"While chaos theory is, in the last analysis, no more than a metaphor for human society, it can be a valuable metaphor. It makes us sensitive to the types of organizations we create and the way we deal with the situations that surround us." (F David Peat, "From Certainty to Uncertainty", 2002)

Kevin Kelly - Collected Quotes

"A network nurtures small failures in order that large failures don't happen as often." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"An event is not triggered by a chain of being, but by a field of causes spreading horizontally, like creeping tide." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Artificial complex systems will be deliberately infused with organic principles simply to keep them going." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"At the other far extreme, we find many systems ordered as a patchwork of parallel operations, very much as in the neural network of a brain or in a colony of ants. Action in these systems proceeds in a messy cascade of interdependent events. Instead of the discrete ticks of cause and effect that run a clock, a thousand clock springs try to simultaneously run a parallel system. Since there is no chain of command, the particular action of any single spring diffuses into the whole, making it easier for the sum of the whole to overwhelm the parts of the whole. What emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is far more important. This is the swarm model." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Complexity must be grown from simple systems that already work." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Dumb parts, properly constituted into a swarm, yield smart results." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Evolution is a technological, mathematical, informational, and biological process rolled into one. It could almost be said to be a law of physics, a principle that reigns over all created multitudes, whether they have genes or not." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"If machines knew as much about each other as we know about each other (even in our privacy), the ecology of machines would be indomitable." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"It has long been appreciated by science that large numbers behave differently than small numbers. Mobs breed a requisite measure of complexity for emergent entities. The total number of possible interactions between two or more members accumulates exponentially as the number of members increases. At a high level of connectivity, and a high number of members, the dynamics of mobs takes hold. " (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Swarm systems generate novelty for three reasons: (1) They are 'sensitive to initial conditions' - a scientific shorthand for saying that the size of the effect is not proportional to the size of the cause - so they can make a surprising mountain out of a molehill. (2) They hide countless novel possibilities in the exponential combinations of many interlinked individuals. (3) They don’t reckon individuals, so therefore individual variation and imperfection can be allowed. In swarm systems with heritability, individual variation and imperfection will lead to perpetual novelty, or what we call evolution." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The central act of the coming era is to connect everything to everything." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The hardest lesson for humans to learn: that organic complexity will entail organic time." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The only organization capable of unprejudiced growth, or unguided learning, is a network. All other topologies limit what can happen." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The work of managing a natural environment is inescapably a work of local knowledge." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995) 

"The world of our own making has become so complicated that we must turn to the world of the born to understand how to manage it." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"To err is human; to manage error is system." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"When everything is connected to everything in a distributed network, everything happens at once. When everything happens at once, wide and fast moving problems simply route around any central authority. Therefore overall governance must arise from the most humble interdependent acts done locally in parallel, and not from a central command. A mob can steer itself, and in the territory of rapid, massive, and heterogeneous change, only a mob can steer. To get something from nothing, control must rest at the bottom within simplicity. " (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"A standalone object, no matter how well designed, has limited potential for new weirdness. A connected object, one that is a node in a network that interacts in some way with other nodes, can give birth to a hundred unique relationships that it never could do while unconnected. Out of this tangle of possible links come myriad new niches for innovations and interactions." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"All things being equal, choose technology that connects. […] This aspect of technology has increasing importance, at times overshadowing such standbys as speed and price. If you are in doubt about what technology to purchase, get the stuff that will connect the most widely, the most often, and in the most ways. Avoid anything that resembles an island, no matter how well endowed that island is." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998) 

"Any network has two ingredients: nodes and connections. In the grand network we are now assembling, the size of the nodes is collapsing while the quantity and quality of the connections are exploding." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"At present, there is far more to be gained by pushing the boundaries of what can be done by the bottom than by focusing on what can be done at the top. When it comes to control, there is plenty of room at the bottom. What we are discovering is that peer-based networks with millions of parts, minimal oversight, and maximum connection among them can do far more than anyone ever expected. We don’t yet know what the limits of decentralization are." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Don’t solve problems; pursue opportunities. […] In both the short and long term, our ability to solve social and economic problems will be limited primarily to our lack of imagination in seizing opportunities, rather than trying to optimize solutions. There is more to be gained by producing more opportunities than by optimizing existing ones." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998) 

"Mathematics says the sum value of a network increases as the square of the number of members. In other words, as the number of nodes in a network increases arithmetically, the value of the network increases exponentially. Adding a few more members can dramatically increase the value for all members." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Networks have existed in every economy. What’s different now is that networks, enhanced and multiplied by technology, penetrate our lives so deeply that 'network' has become the central metaphor around which our thinking and our economy are organized. Unless we can understand the distinctive logic of networks, we can’t profit from the economic transformation now under way." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Technology is no panacea. It will never solve the ills or injustices of society. Technology can do only one thing for us - but it is an astonishing thing: Technology brings us an increase in opportunities." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"The internet model has many lessons for the new economy but perhaps the most important is its embrace of dumb swarm power. The aim of swarm power is superior performance in a turbulent environment. When things happen fast and furious, they tend to route around central control. By interlinking many simple parts into a loose confederation, control devolves from the center to the lowest or outermost points, which collectively keep things on course. A successful system, though, requires more than simply relinquishing control completely to the networked mob." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"The distinguishing characteristic of networks is that they contain no clear center and no clear outside boundaries. Within a network everything is potentially equidistant from everything else." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)
