05 December 2014

🕸Systems Engineering: Feedback (Definitions)

"Feedback is the control of a system by reinserting into the system the results of its performance. If these results are merely used as numerical data for criticism of the system and its regulation, we have the simple feedback of the control engineer. If, however, the information which proceeds backwards from the performance is able to change the general method and pattern of the performance, we have a process which may  very well be called learning." (Norbert Wiener, "The Human Use of Human Beings: Cybernetics and Society", 1954)

"In general, this term is used to describe systems or inputs where the current output or state can modify the effect of input. A positive feedback acts as an amplifier or magnifier on the output (e.g., the rich get richer and the poor get poorer). A negative feedback acts to diminish large inputs and magnify small inputs. This becomes important in keeping a system in control or 'on target'." (William J Raynor Jr., "The International Dictionary of Artificial Intelligence", 1999)

"set of signals connected from the output terminals to some input terminals." (Teuvo Kohonen, "Self-Organizing Maps" 3rd Ed., 2001)

"The process in which part of the output of a system is returned to its input in order to regulate its further output. Often this is done intentionally, in order to control the dynamic behavior of the system." (Moti Frank, "Active Learning and Its Implementation for Teaching", 2008)

"The return of a portion of the output of a process or system to the input, especially when used to maintain performance or to control a system or process." (Dino Ruta, "Organizational Implications of Managing the HRIS Employee Experience", 2009)

"Connections that travel backward in a neural network from higher to lower layers creating a loop in the network that allows signals to circulate within it." (Terrence J Sejnowski, "The Deep Learning Revolution", 2018)

"when the effect of a causal impact comes back to influence the original cause of that effect." (David N Ford, "A system dynamics glossary", System Dynamics Review Vol. 35 (4), 2019)

"it is the process that allows to have information of one variable on another or others, during the simulation and not at the end, this allows to adjust on the model to modify, if it were the case, possible decisions that do not affect the desired result by the organization." (Ernesto A Lagarda-Leyva & Ernesto A Vega-Telles, "Application of System Dynamics in a Gasoline Service Station: Decision Making Using Graphical Interface", 2020)

🕸Systems Engineering: Environment (Just the Quotes)

"The change from one stable equilibrium to the other may take place as the result of the isolation of a small unrepresentative group of the population, a temporary change in the environment which alters the relative viability of different types, or in several other ways." (John B S Haldane, "The Causes of Evolution", 1932)

"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.) (Erwin Schrödinger, "What is Life?", 1944)

"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)

"Every isolated determinate dynamic system, obeying unchanging laws, will ultimately develop some sort of organisms that are adapted to their environments." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"[...] in a state of dynamic equilibrium with their environments. If they do not maintain this equilibrium they die; if they do maintain it they show a degree of spontaneity, variability, and purposiveness of response unknown in the non-living world. This is what is meant by ‘adaptation to environment’ […] [Its] essential feature […] is stability - that is, the ability to withstand disturbances." (Kenneth Craik, 'Living organisms', "The Nature of Psychology", 1966)

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"System' is the concept that refers both to a complex of interdependencies between parts, components, and processes, that involves discernible regularities of relationships, and to a similar type of interdependency between such a complex and its surrounding environment." (Talcott Parsons, "Systems Analysis: Social Systems", 1968)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"The main object of cybernetics is to supply adaptive, hierarchical models, involving feedback and the like, to all aspects of our environment. Often such modelling implies simulation of a system where the simulation should achieve the object of copying both the method of achievement and the end result. Synthesis, as opposed to simulation, is concerned with achieving only the end result and is less concerned (or completely unconcerned) with the method by which the end result is achieved. In the case of behaviour, psychology is concerned with simulation, while cybernetics, although also interested in simulation, is primarily concerned with synthesis." (Frank H George, "Soviet Cybernetics, the militairy and Professor Lerner", New Scientist, 1973)

"For any system the environment is always more complex than the system itself. No system can maintain itself by means of a point-for-point correlation with its environment, i.e., can summon enough 'requisite variety' to match its environment. So each one has to reduce environmental complexity - primarily by restricting the environment itself and perceiving it in a categorically preformed way. On the other hand, the difference of system and environment is a prerequisite for the reduction of complexity because reduction can be performed only within the system, both for the system itself and its environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"General systems theory and cybernetics supplanted the classical conceptual model of a whole made out of parts and relations between parts with a model emphasizing the difference between systems and environments. This new paradigm made it possible to relate both the structures (including forms of differentiation) and processes of systems to the environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"There is a strong current in contemporary culture advocating ‘holistic’ views as some sort of cure-all […] Reductionism implies attention to a lower level while holistic implies attention to higher level. These are intertwined in any satisfactory description: and each entails some loss relative to our cognitive preferences, as well as some gain [...] there is no whole system without an interconnection of its parts and there is no whole system without an environment." (Francisco Varela, "On being autonomous: The lessons of natural history for systems theory", 1977)

"Every system of whatever size must maintain its own structure and must deal with a dynamic environment, i.e., the system must strike a proper balance between stability and change. The cybernetic mechanisms for stability (i.e., homeostasis, negative feedback, autopoiesis, equifinality) and change (i.e., positive feedback, algedonodes, self-organization) are found in all viable systems." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)

"Any system that insulates itself from diversity in the environment tends to atrophy and lose its complexity and distinctive nature." (Gareth Morgan, "Images of Organization", 1986)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Neural networks conserve the complexity of the systems they model because they have complex structures themselves. Neural networks encode information about their environment in a distributed form. […] Neural networks have the capacity to self-organise their internal structure." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Systems thinking practices the exact opposite of this analytic approach. Systems thinking studies the organization as a whole in its interaction with its environment. Then, it works backwards to understand how each part of that whole works in relation to, and support of, the entire system’s objectives. Only then can the core strategies be formulated." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Systems, and organizations as systems, can only be understood holistically. Try to understand the system and its environment first. Organizations are open systems and, as such, are viable only in interaction with and adaptation to the changing environment." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"Feedback and its big brother, control theory, are such important concepts that it is odd that they usually find no formal place in the education of physicists. On the practical side, experimentalists often need to use feedback. Almost any experiment is subject to the vagaries of environmental perturbations. Usually, one wants to vary a parameter of interest while holding all others constant. How to do this properly is the subject of control theory. More fundamentally, feedback is one of the great ideas developed (mostly) in the last century, with particularly deep consequences for biological systems, and all physicists should have some understanding of such a basic concept." (John Bechhoefer, "Feedback for physicists: A tutorial essay on control", Reviews of Modern Physics Vol. 77, 2005)

"The single most important property of a cybernetic system is that it is controlled by the relationship between endogenous goals and the external environment. [...] In a complex system, overarching goals may be maintained (or attained) by means of an array of hierarchically organized subgoals that may be pursued contemporaneously, cyclically, or seriatim." (Peter Corning, "Synergy, Cybernetics, and the Evolution of Politics", 2005)

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)

"Systematic usage of the methods of modern control theory to study physical systems is a key feature of a new research area in physics that may be called cybernetical physics. The subject of cybernetical physics is focused on studying physical systems by means of feedback interactions with the environment. Its methodology heavily relies on the design methods developed in cybernetics. However, the approach of cybernetical physics differs from the conventional use of feedback in control applications (e.g., robotics, mechatronics) aimed mainly at driving a system to a prespecified position or a given trajectory." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"In physical, exponentially growing systems, there must be at least one reinforcing loop driving growth and at least one balancing feedback loop constraining growth, because no system can grow forever in a finite environment." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"In that sense, a self-organizing system is intrinsically adaptive: it maintains its basic organization in spite of continuing changes in its environment. As noted, perturbations may even make the system more robust, by helping it to discover a more stable organization." (Francis Heylighen, "Complexity and Self-Organization", 2008)

"If universality is one of the observed characteristics of complex dynamical systems in many fields of study, a second characteristic that flows from the study of these systems is that of emergence. As self-organizing systems go about their daily business, they are constantly exchanging matter and energy with their environment, and this allows them to remain in a state that is far from equilibrium. That allows spontaneous behavior to give rise to new patterns." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"To remedy chaotic situations requires a chaotic approach, one that is non-linear, constantly morphing, and continually sharpening its competitive edge with recurring feedback loops that build upon past experiences and lessons learned. Improvement cannot be sustained without reflection. Chaos arises from myriad sources that stem from two origins: internal chaos rising within you, and external chaos being imposed upon you by the environment. The result of this push/pull effect is the disequilibrium [...]." (Jeff Boss, "Navigating Chaos: How to Find Certainty in Uncertain Situations", 2015)

More quotes on "Environment" at the-web-of-knowledge.blogspot.com.

🕸Systems Engineering: Feedback Loop (Definition)

"A feedback loop is a closed path of causal relations among variables. Feedback loops usually represent the process of monitoring the state of the system, the effects of decisions in the system state, and future decisions." (Luis F Luna-Reyes, "System Dynamics to Understand Public Information Technology", 2008)

"A circular chain of interactions, such that each element in the loop influences its own future level of activation. Feedback loops are also known as circuits." (Elizabeth Santiago-Cortés, "Discrete Networks as a Suitable Approach for the Analysis of Genetic Regulation", 2009)

"A feedback loop is a cycle in a directed graph whose edges can represent either positive or negative inputs." (Maria C A Leite & Yunjiao Wang, "Multistability, oscillations and bifurcations in feedback loops", Mathematical Biosciences and Engineering Vol 7 (1), 2010)

"A linked system of statements in a map in which the arrows show a path of links that feed back to the starting point. A feedback loop exists when the statements around the loop are all variables - that is, they can vary over time, typically increasing or decreasing, or getting better or worse. Feedback loops can be stable or generative (vicious or virtuous)." (Fran Ackermann et al, "Visual Strategy: Strategy Mapping for Public and Nonprofit Organizations", 2014)

[control *:] "A conceptual construct of control theory in which a comparison between a goal state and the measured current state drives a decision-making process for an action to bring the system closer to the goal state. The feedback loop increases the effectiveness of the defensive actions." (O Sami Saydjari, "Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time", 2018)

"A feedback loop is a sequence of variables and causal links that creates a closed ring of causal influences." (David N Ford, "A system dynamics glossary", System Dynamics Review Vol. 35 (4), 2019)

[balancing *:] "a feedback loop in which the resultant effect of the causal links over time limits or constrains the movement of variables. Balancing loops seek equilibrium, trying to bring stocks to a desired state and keep them there. Also called a negative, compensating, goal-seeking or controlling feedback loop." (David N Ford, "A system dynamics glossary", System Dynamics Review Vol. 35 (4), 2019)

[reinforcing *:] "a feedback loop in which the sum effect of the causal links tends to strengthen (reinforce) the movement of variable values in a given direction due to positive feedback." (David N Ford, "A system dynamics glossary", System Dynamics Review Vol. 35 (4), 2019)

"Is a closed chain pattern of cause and effect reaction connections from a stock, activated by decisions, rules, physical laws, or actions." (Tatiana C Valencia & Stephanie J Valencia, "Cultivating Flow and Happiness in Children", 2020)

"Feedback loop is defined as a system used to control the level of a variable in which there is an identifiable receptor (sensor), control center (integrator or comparator), effectors, and methods of communication." (Lumen Learning, Anatomy and Physiology I [course])

04 December 2014

🕸Systems Engineering: Optimization (Just the Quotes)

"The Systems Engineering method recognizes each system is an integrated whole even though composed of devices, specialized structures and sub-functions. It is further recognized that any system has a number of objectives and that the balance between them may differ widely from system to system. The methods seek to optimize the overall system function according to the weighted objectives and to achieve maximum capability of its parts." (Jack A Morton, "Integrating of Systems Engineering with Component Development", Electrical Manufacturing, 1959)

"The process of formulating and structuring a system are important and creative, since they provide and organize the information, which each system. 'establishes the number of objectives and the balance between them which will be optimized'. Furthermore, they help identify and define the system parts. Furthermore, they help identify and define the system parts which make up its 'diverse, specialized structures and subfunctions'." (Harold Chestnut, "Systems Engineering Tools", 1965)

"The Systems engineering method recognizes each system is an integrated whole even though composed of diverse, specialized structures and sub-functions. It further recognizes that any system has a number of objectives and that the balance between them may differ widely from system to system. The methods seek to optimize the overall system functions according to the weighted objectives and to achieve maximum compatibility of its parts." (Harold Chestnut, "Systems Engineering Tools", 1965)

"Game theory is a collection of mathematical models designed to study situations involving conflict and/or cooperation. It allows for a multiplicity of decision makers who may have different preferences and objectives. Such models involve a variety of different solution concepts concerned with strategic optimization, stability, bargaining, compromise, equity and coalition formation." (Notices of the American Mathematical Society Vol. 26 (1), 1979) 

"Because the individual parts of a complex adaptive system are continually revising their ('conditioned') rules for interaction, each part is embedded in perpetually novel surroundings (the changing behavior of the other parts). As a result, the aggregate behavior of the system is usually far from optimal, if indeed optimality can even be defined for the system as a whole. For this reason, standard theories in physics, economics, and elsewhere, are of little help because they concentrate on optimal end-points, whereas complex adaptive systems 'never get there'. They continue to evolve, and they steadily exhibit new forms of emergent behavior." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992) 

"Mathematical programming (or optimization theory) is that branch of mathematics dealing with techniques for maximizing or minimizing an objective function subject to linear, nonlinear, and integer constraints on the variables."  (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)

"The whole idea of a system is to optimize - not maximize - the fit of its elements in order to maximize the whole. If we merely maximize the elements of systems, we end up suboptimizing the whole [...]" (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Optimization by individual agents, often used to derive competitive equilibria, are unnecessary for an actual economy to approximately attain such equilibria. From the failure of humans to optimize in complex tasks, one need not conclude that the equilibria derived from the competitive model are descriptively irrelevant. We show that even in complex economic systems, such equilibria can be attained under a range of surprisingly weak assumptions about agent behavior." (Antoni Bosch-Domènech & Shyam Sunder, "Tracking the Invisible Hand", 2000)

"The players in a game are said to be in strategic equilibrium (or simply equilibrium) when their play is mutually optimal: when the actions and plans of each player are rational in the given strategic environment - i. e., when each knows the actions and plans of the others." (Robert Aumann, "War and Peace", 2005)

"Swarm intelligence can be effective when applied to highly complicated problems with many nonlinear factors, although it is often less effective than the genetic algorithm approach [...]. Swarm intelligence is related to swarm optimization […]. As with swarm intelligence, there is some evidence that at least some of the time swarm optimization can produce solutions that are more robust than genetic algorithms. Robustness here is defined as a solution’s resistance to performance degradation when the underlying variables are changed. (Michael J North & Charles M Macal, Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation, 2007)

"Optimization is more than finding the best simulation results. It is itself a complex and evolving field that, subject to certain information constraints, allows data scientists, statisticians, engineers, and traders alike to perform reality checks on modeling results." (Chris Conlan, "Automated Trading with R: Quantitative Research and Platform Development", 2016)

"It is the field of artificial intelligence in which the population is in the form of agents which search in a parallel fashion with multiple initialization points. The swarm intelligence-based algorithms mimic the physical and natural processes for mathematical modeling of the optimization algorithm. They have the properties of information interchange and non-centralized control structure." (Sajad A Rather & P Shanthi Bala, "Analysis of Gravitation-Based Optimization Algorithms for Clustering and Classification", 2020)

More quotes on "Optimization" at the-web-of-knowledge.blogspot.com.

🕸Systems Engineering: Behavior (Just the Quotes)

"[Disorganized complexity] is a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, in spite of this helter-skelter, or unknown, behavior of all the individual variables, the system as a whole possesses certain orderly and analyzable average properties. [...] [Organized complexity is] not problems of disorganized complexity, to which statistical methods hold the key. They are all problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole. They are all, in the language here proposed, problems of organized complexity." (Warren Weaver, "Science and Complexity", American Scientist Vol. 36, 1948)

"The first attempts to consider the behavior of so-called 'random neural nets' in a systematic way have led to a series of problems concerned with relations between the 'structure' and the 'function' of such nets. The 'structure' of a random net is not a clearly defined topological manifold such as could be used to describe a circuit with explicitly given connections. In a random neural net, one does not speak of 'this' neuron synapsing on 'that' one, but rather in terms of tendencies and probabilities associated with points or regions in the net." (Anatol Rapoport, "Cycle distributions in random nets", The Bulletin of Mathematical Biophysics 10(3), 1948)

"Every organism represents a system, by which term we mean a complex of elements in mutual interaction. From this obvious statement the limitations of the analytical and summative conceptions must follow. First, it is impossible to resolve the phenomena of life completely into elementary units; for each individual part and each individual event depends not only on conditions within itself, but also to a greater or lesser extent on the conditions within the whole, or within superordinate units of which it is a part. Hence the behavior of an isolated part is, in general, different from its behavior within the context of the whole. [...] Secondly, the actual whole shows properties that are absent from its isolated parts." (Ludwig von Bertalanffy, "Problems of Life", 1952)

"In our definition of system we noted that all systems have interrelationships between objects and between their attributes. If every part of the system is so related to every other part that any change in one aspect results in dynamic changes in all other parts of the total system, the system is said to behave as a whole or coherently. At the other extreme is a set of parts that are completely unrelated: that is, a change in each part depends only on that part alone. The variation in the set is the physical sum of the variations of the parts. Such behavior is called independent or physical summativity." (Arthur D Hall & Robert E Fagen, "Definition of System", General Systems Vol. 1, 1956)

"Systems engineering is the name given to engineering activity which considers the overall behavior of a system, or more generally which considers all factors bearing on a problem, and the systems approach to control engineering problems is correspondingly that approach which examines the total dynamic behavior of an integrated system. It is concerned more with quality of performance than with sizes, capacities, or efficiencies, although in the most general sense systems engineering is concerned with overall, comprehensive appraisal." (Ernest F Johnson, "Automatic process control", 1958)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […]" (W Ross Ashby, "Principles of the self-organizing system", 1962)

"Synergy is the only word in our language that means behavior of whole systems unpredicted by the separately observed behaviors of any of the system's separate parts or any subassembly of the system's parts." (R Buckminster Fuller, "Operating Manual for Spaceship Earth", 1963)

"[…] cybernetics studies the flow of information round a system, and the way in which this information is used by the system as a means of controlling itself: it does this for animate and inanimate systems indifferently. For cybernetics is an interdisciplinary science, owing as much to biology as to physics, as much to the study of the brain as to the study of computers, and owing also a great deal to the formal languages of science for providing tools with which the behaviour of all these systems can be objectively described." (A Stafford Beer, 1966)

"We've seen that even in the simplest situations nonlinearities can interfere with a linear approach to aggregates. That point holds in general: nonlinear interactions almost always make the behavior of the aggregate more complicated than would be predicted by summing or averaging." (Lewis Mumford, "The Myth of the Machine" Vol 1, 1967)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban dynamics", 1969)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"As the complexity of a system increases, our ability to make precise and yet significant statements about its behavior diminishes until a threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics." (Lotfi A Zadeh, 1973)

"When a mess, which is a system of problems, is taken apart, it loses its essential properties and so does each of its parts. The behavior of a mess depends more on how the treatment of its parts interact than how they act independently of each other. A partial solution to a whole system of problems is better than whole solutions of each of its parts taken separately." (Russell L Ackoff, "The future of operational research is past", The Journal of the Operational Research Society Vol. 30 (2), 1979)

"Given an approximate knowledge of a system's initial conditions and an understanding of natural law, one can calculate the approximate behavior of the system. This assumption lay at the philosophical heart of science." (James Gleick, Chaos: Making a New Science, 1987)

"Linear relationships are easy to think about: the more the merrier. Linear equations are solvable, which makes them suitable for textbooks. Linear systems have an important modular virtue: you can take them apart and put them together again - the pieces add up. Nonlinear systems generally cannot be solved and cannot be added together. [...] Nonlinearity means that the act of playing the game has a way of changing the rules. [...] That twisted changeability makes nonlinearity hard to calculate, but it also creates rich kinds of behavior that never occur in linear systems." (James Gleick, "Chaos: Making a New Science", 1987)

"Systems thinking is a special form of holistic thinking - dealing with wholes rather than parts. One way of thinking about this is in terms of a hierarchy of levels of biological organization and of the different 'emergent' properties that are evident in say, the whole plant (e.g. wilting) that are not evident at the level of the cell (loss of turgor). It is also possible to bring different perspectives to bear on these different levels of organization. Holistic thinking starts by looking at the nature and behaviour of the whole system that those participating have agreed to be worthy of study. This involves: (i) taking multiple partial views of 'reality' […] (ii) placing conceptual boundaries around the whole, or system of interest and (iii) devising ways of representing systems of interest." (C J Pearson and R L Ison, "Agronomy of Grassland Systems", 1987) 

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"Unfortunately, recognizing a system as chaotic will not tell us all that we might like to know. It will not provide us with a means of predicting the future course of the system. It will tell us that there is a limit to how far ahead we can predict, but it may not tell us what this limit is. Perhaps the best advice that chaos 'theory' can give us is not to jump at conclusions; unexpected occurrences may constitute perfectly normal behavior." (Edward N Lorenz, "Chaos, spontaneous climatic variations and detection of the greenhouse effect", 1991)

"Because the individual parts of a complex adaptive system are continually revising their ('conditioned') rules for interaction, each part is embedded in perpetually novel surroundings (the changing behavior of the other parts). As a result, the aggregate behavior of the system is usually far from optimal, if indeed optimality can even be defined for the system as a whole. For this reason, standard theories in physics, economics, and elsewhere, are of little help because they concentrate on optimal end-points, whereas complex adaptive systems 'never get there'. They continue to evolve, and they steadily exhibit new forms of emergent behavior." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992)

"Fundamental to catastrophe theory is the idea of a bifurcation. A bifurcation is an event that occurs in the evolution of a dynamic system in which the characteristic behavior of the system is transformed. This occurs when an attractor in the system changes in response to change in the value of a parameter. A catastrophe is one type of bifurcation. The broader framework within which catastrophes are located is called dynamical bifurcation theory." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Modelling techniques on powerful computers allow us to simulate the behaviour of complex systems without having to understand them.  We can do with technology what we cannot do with science.  […] The rise of powerful technology is not an unconditional blessing.  We have  to deal with what we do not understand, and that demands new  ways of thinking." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"It is, however, fair to say that very few applications of swarm intelligence have been developed. One of the main reasons for this relative lack of success resides in the fact that swarm-intelligent systems are hard to 'program', because the paths to problem solving are not predefined but emergent in these systems and result from interactions among individuals and between individuals and their environment as much as from the behaviors of the individuals themselves. Therefore, using a swarm-intelligent system to solve a problem requires a thorough knowledge not only of what individual behaviors must be implemented but also of what interactions are needed to produce such or such global behavior." (Eric Bonabeau et al, "Swarm Intelligence: From Natural to Artificial Systems", 1999)

"With the growing interest in complex adaptive systems, artificial life, swarms and simulated societies, the concept of “collective intelligence” is coming more and more to the fore. The basic idea is that a group of individuals (e. g. people, insects, robots, or software agents) can be smart in a way that none of its members is. Complex, apparently intelligent behavior may emerge from the synergy created by simple interactions between individuals that follow simple rules." (Francis Heylighen, "Collective Intelligence and its Implementation on the Web", 1999)

"Chaos theory reconciles our intuitive sense of free will with the deterministic laws of nature. However, it has an even deeper philosophical ramification. Not only do we have freedom to control our actions, but also the sensitivity to initial conditions implies that even our smallest act can drastically alter the course of history, for better or for worse. Like the butterfly flapping its wings, the results of our behavior are amplified with each day that passes, eventually producing a completely different world than would have existed in our absence!" (Julien C Sprott, "Strange Attractors: Creating Patterns in Chaos", 2000)

"The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"In principle, a self-organising system cannot be constructed, since its organisation and behaviour cannot be prescribed and created by an external source. It emerges autonomously in certain conditions (which cannot be prescribed either). The task of the researcher is to investigate in what kind of systems and under what kind of conditions self-organisation emerges." (Rein Vihalemm, "Chemistry as an Interesting Subject for the Philosophy of Science", 2001)

"A self-organizing system not only regulates or adapts its behavior, it creates its own organization. In that respect it differs fundamentally from our present systems, which are created by their designer. We define organization as structure with function. Structure means that the components of a system are arranged in a particular order. It requires both connections, that integrate the parts into a whole, and separations that differentiate subsystems, so as to avoid interference. Function means that this structure fulfils a purpose." (Francis Heylighen & Carlos Gershenson, "The Meaning of Self-organization in Computing", IEEE Intelligent Systems, 2003)

"Emergence is not really mysterious, although it may be complex. Emergence is brought about by the interactions between the parts of a system. The galloping horse illusion depends upon the persistence of the human retina/brain combination, for instance. Elemental gases bond in combination by sharing outer electrons, thereby altering the appearance and behavior of the combination. In every case of emergence, the source is interaction between the parts - sometimes, as with the brain, very many parts - so that the phenomenon defies simple explanation." (Derek Hitchins, "Advanced Systems Thinking, Engineering and Management", 2003)

"The existence of equilibria or steady periodic solutions is not sufficient to determine if a system will actually behave that way. The stability of these solutions must also be checked. As parameters are changed, a stable motion can become unstable and new solutions may appear. The study of the changes in the dynamic behavior of systems as parameters are varied is the subject of bifurcation theory. Values of the parameters at which the qualitative or topological nature of the motion changes are known as critical or bifurcation values." (Francis C Moona, "Nonlinear Dynamics", 2003)

"This reduction principle - the reduction of the behavior of a complex system to the behavior of its parts - is valid only if the level of complexity of the system is rather low." (Andrzej P Wierzbicki & Yoshiteru Nakamori, "Creative Space: Models of Creative Processes for the Knowledge Civilization Age", Studies in Computational Intelligence Vol.10, 2006)

"How is it that an ant colony can organize itself to carry out the complex tasks of food gathering and nest building and at the same time exhibit an enormous degree of resilience if disrupted and forced to adapt to changing situations? Natural systems are able not only to survive, but also to adapt and become better suited to their environment, in effect optimizing their behavior over time. They seemingly exhibit collective intelligence, or swarm intelligence as it is called, even without the existence of or the direction provided by a central authority." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)

"In ecology, we are often interested in exploring the behavior of whole systems of species or ecosystem composed of individual components which interact through biological processes. We are interested not simply in the dynamics of each species or component in isolation, but the dynamics of each species or component in the context of all the others and how those coupled dynamics account for properties of the system as a whole, such as its persistence. This is what people seem to mean when they say that ecology is ‘holistic’, an otherwise rather vague term." (John Pastor, "Mathematical Ecology of Populations and Ecosystems", 2008)

"You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays." (Donella H Meadow, "Thinking in Systems: A Primer", 2008)

"Complexity theory can be defined broadly as the study of how order, structure, pattern, and novelty arise from extremely complicated, apparently chaotic systems and conversely, how complex behavior and structure emerges from simple underlying rules. As such, it includes those other areas of study that are collectively known as chaos theory, and nonlinear dynamical theory." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini," Chaos: From Simple Models to Complex Systems", 2010)

"System dynamics is an approach to understanding the behaviour of over time. It deals with internal feedback loops and time delays that affect the behaviour of the entire system. It also helps the decision maker untangle the complexity of the connections between various policy variables by providing a new language and set of tools to describe. Then it does this by modeling the cause and effect relationships among these variables." (Raed M Al-Qirem & Saad G Yaseen, "Modelling a Small Firm in Jordan Using System Dynamics", 2010)

"Complex systems defy intuitive solutions. Even a third-order, linear differential equation is unsolvable by inspection. Yet, important situations in management, economics, medicine, and social behavior usually lose reality if simplified to less than fifth-order nonlinear dynamic systems. Attempts to deal with nonlinear dynamic systems using ordinary processes of description and debate lead to internal inconsistencies. Underlying assumptions may have been left unclear and contradictory, and mental models are often logically incomplete. Resulting behavior is likely to be contrary to that implied by the assumptions being made about' underlying system structure and governing policies." (Jay W Forrester, "Modeling for What Purpose?", The Systems Thinker Vol. 24 (2), 2013)

"Each systems archetype embodies a particular theory about dynamic behavior that can serve as a starting point for selecting and formulating raw data into a coherent set of interrelationships. Once those relationships are made explicit and precise, the "theory" of the archetype can then further guide us in our data-gathering process to test the causal relationships through direct observation, data analysis, or group deliberation." (Daniel H Kim, "Systems Archetypes as Dynamic Theories", The Systems Thinker Vol. 24 (1), 2013)

"Swarm intelligence (SI) is a branch of computational intelligence that discusses the collective behavior emerging within self-organizing societies of agents. SI was inspired by the observation of the collective behavior in societies in nature such as the movement of birds and fish. The collective behavior of such ecosystems, and their artificial counterpart of SI, is not encoded within the set of rules that determines the movement of each isolated agent, but it emerges through the interaction of multiple agents." (Maximos A Kaliakatsos-Papakostas et al, "Intelligent Music Composition", 2013)

"The problem of complexity is at the heart of mankind’s inability to predict future events with any accuracy. Complexity science has demonstrated that the more factors found within a complex system, the more chances of unpredictable behavior. And without predictability, any meaningful control is nearly impossible. Obviously, this means that you cannot control what you cannot predict. The ability ever to predict long-term events is a pipedream. Mankind has little to do with changing climate; complexity does." (Lawrence K Samuels, "The Real Science Behind Changing Climate", LewRockwell.com, August 1, 2014)

"Complex systems are networks made of a number of components that interact with each other, typically in a nonlinear fashion. Complex systems may arise and evolve through self-organization, such that they are neither completely regular nor completely random, permitting the development of emergent behavior at macroscopic scales." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"System dynamics [...] uses models and computer simulations to understand behavior of an entire system, and has been applied to the behavior of large and complex national issues. It portrays the relationships in systems as feedback loops, lags, and other descriptors to explain dynamics, that is, how a system behaves over time. Its quantitative methodology relies on what are called 'stock-and-flow diagrams' that reflect how levels of specific elements accumulate over time and the rate at which they change. Qualitative systems thinking constructs evolved from this quantitative discipline." (Karen L Higgins, "Economic Growth and Sustainability: Systems Thinking for a Complex World", 2015)

"A complex system means a system whose perceived complicated behaviors can be attributed to one or more of the following characteristics: large number of element, large number of relationships among elements, non-linear and discontinuous relationship, and uncertain characteristics of elements." (Chunfang Zhou, "Fostering Creative Problem Solvers in Higher Education: A Response to Complexity of Societies", Handbook of Research on Creative Problem-Solving Skill Development in Higher Education, 2017)

More quotes on "Behavior" at the-web-of-knowledge.blogspot.com

03 December 2014

🕸Systems Engineering: Complexity (Just the Quotes)

"Unity of plan everywhere lies hidden under the mask of diversity of structure - the complex is everywhere evolved out of the simple." (Thomas H Huxley, "A Lobster; or, the Study of Zoology", 1861)

"Simplicity of structure means organic unity, whether the organism be simple or complex; and hence in all times the emphasis which critics have laid upon Simplicity, though they have not unfrequently confounded it with narrowness of range." (George H Lewes, "The Principles of Success in Literature", 1865)

"The first obligation of Simplicity is that of using the simplest means to secure the fullest effect. But although the mind instinctlvely rejects all needless complexity, we shall greatly err if we fail to recognise the fact, that what the mind recoils from is not the complexity, but the needlessness." (George H Lewes, "The Principles of Success in Literature", 1865)

"Man’s mind cannot grasp the causes of events in their completeness, but the desire to find those causes is implanted in man’s soul. And without considering the multiplicity and complexity of the conditions any one of which taken separately may seem to be the cause, he snatches at the first approximation to a cause that seems to him intelligible and says: ‘This is the cause!’" (Leo Tolstoy, "War and Peace", 1867)

"A strict materialist believes that everything depends on the motion of matter. He knows the form of the laws of motion though he does not know all their consequences when applied to systems of unknown complexity." (James C Maxwell, [Letter to Mark Pattison] 1868)

"[…] the simplicity of nature which we at present grasp is really the result of infinite complexity; and that below the uniformity there underlies a diversity whose depths we have not yet probed, and whose secret places are still beyond our reach." (William Spottiswoode, 1879)

"The aim of science is always to reduce complexity to simplicity." (William James, "The Principles of Psychology", 1890)

"The complexity of a system is no guarantee of its accuracy." (John P Jordan, "Cost accounting; principles and practice", 1920)

"[…] to the scientific mind the living and the non-living form one continuous series of systems of differing degrees of complexity […], while to the philosophic mind the whole universe, itself perhaps an organism, is composed of a vast number of interlacing organisms of all sizes." (James G Needham, "Developments in Philosophy of Biology", Quarterly Review of Biology Vol. 3 (1), 1928)

"We love to discover in the cosmos the geometrical forms that exist in the depths of our consciousness. The exactitude of the proportions of our monuments and the precision of our machines express a fundamental character of our mind. Geometry does not exist in the earthly world. It has originated in ourselves. The methods of nature are never so precise as those of man. We do not find in the universe the clearness and accuracy of our thought. We attempt, therefore, to abstract from the complexity of phenomena some simple systems whose components bear to one another certain relations susceptible of being described mathematically." (Alexis Carrel, "Man the Unknown", 1935)

"A material model is the representation of a complex system by a system which is assumed simpler and which is also assumed to have some properties similar to those selected for study in the original complex system. A formal model is a symbolic assertion in logical terms of an idealised relatively simple situation sharing the structural properties of the original factual system." (Arturo Rosenblueth & Norbert Wiener, "The Role of Models in Science", Philosophy of Science Vol. 12 (4), 1945)

"[Disorganized complexity] is a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, in spite of this helter-skelter, or unknown, behavior of all the individual variables, the system as a whole possesses certain orderly and analyzable average properties. [...] [Organized complexity is] not problems of disorganized complexity, to which statistical methods hold the key. They are all problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole. They are all, in the language here proposed, problems of organized complexity." (Warren Weaver, "Science and Complexity", American Scientist Vol. 36, 1948)

"Thus, the central theme that runs through my remarks is that complexity frequently takes the form of hierarchy, and that hierarchic systems have some common properties that are independent of their specific content. Hierarchy, I shall argue, is one of the central structural schemes that the architect of complexity uses." (Herbert Simon, "The Architecture of Complexity", Proceedings of the American Philosophical Society Vol. 106 (6), 1962)

"Only a modern systems approach promises to get the full complexity of the interacting phenomena - to see not only the causes acting on the phenomena under study, the possible consequences of the phenomena and the possible mutual interactions of some of these factors, but also to see the total emergent processes as a function of possible positive and/or negative feedbacks mediated by the selective decisions, or "choices," of the individuals and groups directly involved." (Walter F Buckley,"Sociology and modern systems theory", 1967)

"The fundamental problem today is that of organized complexity. Concepts like those of organization, wholeness, directiveness, teleology, and differentiation are alien to conventional physics. However, they pop up everywhere in the biological, behavioral and social sciences, and are, in fact, indispensable for dealing with living organisms or social groups. Thus a basic problem posed to modern science is a general theory of organization. General system theory is, in principle, capable of giving exact definitions for such concepts and, in suitable cases, of putting them to quantitative analysis." (Ludwig von Bertalanffy, "General System Theory", 1968)

"[…] as a model of a complex system becomes more complete, it becomes less understandable. Alternatively, as a model grows more realistic, it also becomes just as difficult to understand as the real world processes it represents." (Jay M Dutton & William H Starbuck," Computer simulation models of human behavior: A history of an intellectual technology", IEEE Transactions on Systems, 1971)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)

"The systems view is the emerging contemporary view of organized complexity, one step beyond the Newtonian view of organized simplicity, and two steps beyond the classical world views of divinely ordered or imaginatively envisaged complexity."  (Ervin László, "Introduction to Systems Philosophy", 1972)

"A powerful tool for reducing apparent complexity is recursion. In a recursive procedure, the method of solution is defined in terms of itself. That is, each part of the routine handles only a small piece of the strategy, then calls the other parts of the routine as needed to handle the rest. The trick is to reduce each hard case to one that is handled simply elsewhere." (Brian W Kernighan & Phillip J Plauger, "The Elements of Programming Style", 1974)

"When the operation to be done is more complex, write a separate subroutine or function. The ease of later comprehending, debugging, and changing the program will more than compensate for any overhead caused by adding the extra modules." (Brian W Kernighan & Phillip J Plauger, "The Elements of Programming Style", 1974)

"Controlling complexity is the essence of computer programming." (Brian W Kernighan, "Software Tools", 1976)

"For any system the environment is always more complex than the system itself. No system can maintain itself by means of a point-for-point correlation with its environment, i.e., can summon enough 'requisite variety' to match its environment. So each one has to reduce environmental complexity - primarily by restricting the environment itself and perceiving it in a categorically preformed way. On the other hand, the difference of system and environment is a prerequisite for the reduction of complexity because reduction can be performed only within the system, both for the system itself and its environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"Nature is disordered, powerful and chaotic, and through fear of the chaos we impose system on it. We abhor complexity, and seek to simplify things whenever we can by whatever means we have at hand. We need to have an overall explanation of what the universe is and how it functions. In order to achieve this overall view we develop explanatory theories which will give structure to natural phenomena: we classify nature into a coherent system which appears to do what we say it does." (James Burke, "The Day the Universe Changed", 1985) 

"All propaganda or popularization involves a putting of the complex into the simple, but such a move is instantly deconstructive. For if the complex can be put into the simple, then it cannot be as complex as it seemed in the first place; and if the simple can be an adequate medium of such complexity, then it cannot after all be as simple as all that." (Terry Eagleton, "Against The Grain", 1986)

"Any system that insulates itself from diversity in the environment tends to atrophy and lose its complexity and distinctive nature." (Gareth Morgan, "Images of Organization", 1986)

"The hardest problems we have to face do not come from philosophical questions about whether brains are machines or not. There is not the slightest reason to doubt that brains are anything other than machines with enormous numbers of parts that work in perfect accord with physical laws. As far as anyone can tell, our minds are merely complex processes. The serious problems come from our having had so little experience with machines of such complexity that we are not yet prepared to think effectively about them." (Marvin Minsky, 1986)

"Fuzziness, then, is a concomitant of complexity. This implies that as the complexity of a task, or of a system for performing that task, exceeds a certain threshold, the system must necessarily become fuzzy in nature." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)

"We might think that complexity could be regarded as an objective attribute of systems. We might even think we could assign a numerical value to it, making it, for instance, the product of the number of features times the number of interrelationships. If a system had ten variables and five links between them, then its 'complexity quotient', measured in this way, would be fifty. If there are no links, its complexity quotient would be zero. Such attempts to measure the complexity of a system have in fact been made." (Dietrich Dörner, "The Logic of Failure: Recognizing and Avoiding Error in Complex Situations", 1989)
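The toy calculation in the quote above can be sketched in a few lines of Python; the function name is illustrative, not from any standard library:

```python
# The quote's "complexity quotient": the product of the number of features
# (variables) and the number of interrelationships (links).

def complexity_quotient(num_variables: int, num_links: int) -> int:
    """Naive complexity measure: variables times links."""
    return num_variables * num_links

print(complexity_quotient(10, 5))  # the quote's example: 50
print(complexity_quotient(10, 0))  # no links: 0
```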

"Symmetry breaking in psychology is governed by the nonlinear causality of complex systems (the 'butterfly effect'), which roughly means that a small cause can have a big effect. Tiny details of initial individual perspectives, but also cognitive prejudices, may 'enslave' the other modes and lead to one dominant view." (Klaus Mainzer, "Thinking in Complexity", 1994)

"The impossibility of constructing a complete, accurate quantitative description of a complex system forces observers to pick which aspects of the system they most wish to understand." (Thomas Levenson, "Measure for Measure: A musical history of science", 1994)

"Crude complexity is ‘the length of the shortest message that will describe a system, at a given level of coarse graining, to someone at a distance, employing language, knowledge, and understanding that both parties share (and know they share) beforehand’." (Murray Gell-Mann, "What is Complexity?" Complexity Vol. 1 (1), 1995)

"Complexity must be grown from simple systems that already work." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"In constructing a model, we always attempt to maximize its usefulness. This aim is closely connected with the relationship among three key characteristics of every systems model: complexity, credibility, and uncertainty. This relationship is not as yet fully understood. We only know that uncertainty (predictive, prescriptive, etc.) has a pivotal role in any efforts to maximize the usefulness of systems models. Although usually (but not always) undesirable when considered alone, uncertainty becomes very valuable when considered in connection to the other characteristics of systems models: in general, allowing more uncertainty tends to reduce complexity and increase credibility of the resulting model. Our challenge in systems modelling is to develop methods by which an optimal level of allowable uncertainty can be estimated for each modelling problem." (George J Klir & Bo Yuan, "Fuzzy Sets and Fuzzy Logic: Theory and Applications", 1995)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yam, "Dynamics of Complexity", 1997)
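The bit-counting definition in the quote above can be made concrete; a minimal sketch, with an illustrative function name:

```python
import math

# Number of binary digits needed to single out one particular state
# among all the states the system could possibly be in.

def bits_to_specify(num_states: int) -> int:
    """Bits required to specify one state among num_states possibilities."""
    return math.ceil(math.log2(num_states))

print(bits_to_specify(2))    # 2 possible states -> 1 bit
print(bits_to_specify(256))  # 256 possible states -> 8 bits
```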

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yam, "Dynamics of Complexity", 1997)

"There is no over-arching theory of complexity that allows us to ignore the contingent aspects of complex systems. If something really is complex, it cannot be adequately described by means of a simple theory. Engaging with complexity entails engaging with specific complex systems. Despite this we can, at a very basic level, make general remarks concerning the conditions for complex behaviour and the dynamics of complex systems. Furthermore, I suggest that complex systems can be modelled." (Paul Cilliers, "Complexity and Postmodernism", 1998)

"Complexity is looking at interacting elements and asking how they form patterns and how the patterns unfold. It’s important to point out that the patterns may never be finished. They’re open-ended. In standard science this hit some things that most scientists have a negative reaction to. Science doesn’t like perpetual novelty." (W Brian Arthur, 1999)

"With the growing interest in complex adaptive systems, artificial life, swarms and simulated societies, the concept of 'collective intelligence' is coming more and more to the fore. The basic idea is that a group of individuals (e.g., people, insects, robots, or software agents) can be smart in a way that none of its members is. Complex, apparently intelligent behavior may emerge from the synergy created by simple interactions between individuals that follow simple rules." (Francis Heylighen, "Collective Intelligence and its Implementation on the Web", 1999)

"Much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. […] the most complex behaviors usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"[…] most earlier attempts to construct a theory of complexity have overlooked the deep link between it and networks. In most systems, complexity starts where networks turn nontrivial." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"[…] networks are the prerequisite for describing any complex system, indicating that complexity theory must inevitably stand on the shoulders of network theory. It is tempting to step in the footsteps of some of my predecessors and predict whether and when we will tame complexity. If nothing else, such a prediction could serve as a benchmark to be disproven. Looking back at the speed with which we disentangled the networks around us after the discovery of scale-free networks, one thing is sure: Once we stumble across the right vision of complexity, it will take little to bring it to fruition. When that will happen is one of the mysteries that keeps many of us going." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"A sudden change in the evolutive dynamics of a system (a ‘surprise’) can emerge, apparently violating a symmetrical law that was formulated by making a reduction on some (or many) finite sequences of numerical data. This is the crucial point. As we have said on a number of occasions, complexity emerges as a breakdown of symmetry (a system that, by evolving with continuity, suddenly passes from one attractor to another) in laws which, expressed in mathematical form, are symmetrical. Nonetheless, this breakdown happens. It is the surprise, the paradox, a sort of butterfly effect that can highlight small differences between numbers that are very close to one another in the continuum of real numbers; differences that may evade the experimental interpretation of data, but that may increasingly amplify in the system’s dynamics." (Cristoforo S Bertuglia & Franco Vaio, "Nonlinearity, Chaos, and Complexity: The Dynamics of Natural and Social Systems", 2003)

"In complexity thinking the darkness principle is covered by the concept of incompressibility [...] The concept of incompressibility suggests that the best representation of a complex system is the system itself and that any representation other than the system itself will necessarily misrepresent certain aspects of the original system." (Kurt Richardson, "Systems theory and complexity: Part 1", Emergence: Complexity & Organization Vol.6 (3), 2004)

"The basic concept of complexity theory is that systems show patterns of organization without organizer (autonomous or self-organization). Simple local interactions of many mutually interacting parts can lead to emergence of complex global structures. […] Complexity originates from the tendency of large dynamical systems to organize themselves into a critical state, with avalanches or 'punctuations' of all sizes. In the critical state, events which would otherwise be uncoupled became correlated." (Jochen Fromm, "The Emergence of Complexity", 2004)

"It is science that brings us an understanding of the true complexity of natural systems. The insights from the science of ecology are teaching us how to work with the checks and balances of nature [...]." (Jamie Goode, "The Science of Wine: From Vine to Glass", 2005)

"Complexity arises when emergent system-level phenomena are characterized by patterns in time or a given state space that have neither too much nor too little form. Neither in stasis nor changing randomly, these emergent phenomena are interesting, due to the coupling of individual and global behaviours as well as the difficulties they pose for prediction. Broad patterns of system behaviour may be predictable, but the system's specific path through a space of possible states is not." (Steve Maguire et al, "Complexity Science and Organization Studies", 2006)

"This reduction principle - the reduction of the behavior of a complex system to the behavior of its parts - is valid only if the level of complexity of the system is rather low." (Andrzej P Wierzbicki & Yoshiteru Nakamori, "Creative Space: Models of Creative Processes for the Knowledge Civilization Age", Studies in Computational Intelligence Vol.10, 2006)

"Computer programs are the most complex things that humans make." (Douglas Crockford, "JavaScript: The Good Parts", 2008)

"The addition of new elements or agents to a particular system multiplies exponentially the number of connections or potential interactions among those elements or agents, and hence the number of possible outcomes. This is an important attribute of complexity theory." (Mark Marson, "What Are Its Implications for Educational Change?", 2008)

"Science reveals complexity unfolding in all dimensions and novel features emerging at all scales and organizational levels of the universe. The more we know the more we become aware of how much we do not know. […] Complexity itself is understood as a particular dynamic or 'movement' in time that is simultaneously stable and unstable, predictable and unpredictable, known and unknown, certain and uncertain." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)
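The sensitivity to initial conditions described in the quote above can be demonstrated with the logistic map, a standard toy model of chaos; the starting values and step count here are arbitrary illustrative choices:

```python
# One step of the logistic map x -> r*x*(1-x), which is chaotic at r = 4.
def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1.0 - x)

# Two trajectories whose starting points differ by one ten-billionth.
x, y = 0.2, 0.2 + 1e-10
max_gap = 0.0
for _ in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

print(max_gap)  # the tiny initial difference has grown enormously
```

The gap between the two trajectories roughly doubles at each step until it saturates at the size of the whole state space, which is why long-term prediction fails even though every step is fully deterministic.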

"Complexity carries with it a lack of predictability different to that of chaotic systems, i.e. sensitivity to initial conditions. In the case of complexity, the lack of predictability is due to relevant interactions and novel information created by them." (Carlos Gershenson, "Understanding Complex Systems", 2011)

"A self-organizing system acts autonomously, as if the interconnecting components had a single mind. And as these components spontaneously march to the beat of their own drummer, they organize, adapt, and evolve toward a greater complexity than one would ever expect by just looking at the parts by themselves." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"Complexity has the propensity to overload systems, making the relevance of a particular piece of information not statistically significant. And when an array of mind-numbing factors is added into the equation, theory and models rarely conform to reality." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"The problem of complexity is at the heart of mankind's inability to predict future events with any accuracy. Complexity science has demonstrated that the more factors found within a complex system, the more chances of unpredictable behavior. And without predictability, any meaningful control is nearly impossible. Obviously, this means that you cannot control what you cannot predict. The ability ever to predict long-term events is a pipedream. Mankind has little to do with changing climate; complexity does." (Lawrence K Samuels, "The Real Science Behind Changing Climate", 2014)

More on "Complexity" at the-web-of-knowledge.blogspot.com.

🕸Systems Engineering: Control (Just the Quotes)

"Feedback is a method of controlling a system by reinserting into it the results of its past performance. If these results are merely used as numerical data for the criticism of the system and its regulation, we have the simple feedback of the control engineers. If, however, the information which proceeds backward from the performance is able to change the general method and pattern of performance, we have a process which may be called learning." (Norbert Wiener, 1954)

"Systems engineering embraces every scientific and technical concept known, including economics, management, operations, maintenance, etc. It is the job of integrating an entire problem or problems to arrive at one overall answer, and the breaking down of this answer into defined units which are selected to function compatibly to achieve the specified objectives. [...] Instrument and control engineering is but one aspect of systems engineering - a vitally important and highly publicized aspect, because the ability to create automatic controls within overall systems has made it possible to achieve objectives never before attainable. While automatic controls are vital to systems which are to be controlled, every aspect of a system is essential. Systems engineering is unbiased, it demands only what is logically required. Control engineers have been the leaders in pulling together a systems approach in the various technologies." (Instrumentation Technology, 1957)

"Systems engineering is the name given to engineering activity which considers the overall behavior of a system, or more generally which considers all factors bearing on a problem, and the systems approach to control engineering problems is correspondingly that approach which examines the total dynamic behavior of an integrated system. It is concerned more with quality of performance than with sizes, capacities, or efficiencies, although in the most general sense systems engineering is concerned with overall, comprehensive appraisal." (Ernest F Johnson, "Automatic process control", 1958)

"There are two types of systems engineering - basic and applied. [...] Systems engineering is, obviously, the engineering of a system. It usually, but not always, includes dynamic analysis, mathematical models, simulation, linear programming, data logging, computing, optimating, etc., etc. It connotes an optimum method, realized by modern engineering techniques. Basic systems engineering includes not only the control system but also all equipments within the system, including all host equipments for the control system. Applications engineering is - and always has been - all the engineering required to apply the hardware of a hardware manufacturer to the needs of the customer. Such applications engineering may include, and always has included where needed, dynamic analysis, mathematical models, simulation, linear programming, data logging, computing, and any technique needed to meet the end purpose - the fitting of an existing line of production hardware to a customer's needs. This is applied systems engineering." (Instruments and Control Systems Vol. 31, 1958)

"Most of our beliefs about complex organizations follow from one or the other of two distinct strategies. The closed-system strategy seeks certainty by incorporating only those variables positively associated with goal achievement and subjecting them to a monolithic control network. The open-system strategy shifts attention from goal achievement to survival and incorporates uncertainty by recognizing organizational interdependence with environment. A newer tradition enables us to conceive of the organization as an open system, indeterminate and faced with uncertainty, but subject to criteria of rationality and hence needing certainty." (James D Thompson, "Organizations in Action", 1967)

"[…] cybernetics studies the flow of information round a system, and the way in which this information is used by the system as a means of controlling itself: it does this for animate and inanimate systems indifferently. For cybernetics is an interdisciplinary science, owing as much to biology as to physics, as much to the study of the brain as to the study of computers, and owing also a great deal to the formal languages of science for providing tools with which the behaviour of all these systems can be objectively described." (A Stafford Beer, 1966)

"According to the science of cybernetics, which deals with the topic of control in every kind of system (mechanical, electronic, biological, human, economic, and so on), there is a natural law that governs the capacity of a control system to work. It says that the control must be capable of generating as much 'variety' as the situation to be controlled." (A Stafford Beer, "Management Science", 1968)
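The "natural law" Beer refers to is Ashby's law of requisite variety: a regulator can absorb at most as much variety as it can itself generate. A rough sketch, measuring variety in bits (log2 of the number of distinguishable states); the function and parameter names are illustrative:

```python
import math

# Lower bound, in bits, on the variety left uncontrolled in the outcome:
# the regulator can remove at most its own variety from the disturbance.

def residual_variety(disturbance_states: int, regulator_states: int) -> float:
    """Bits of disturbance variety the regulator cannot absorb."""
    return max(0.0, math.log2(disturbance_states) - math.log2(regulator_states))

print(residual_variety(16, 16))  # control matches the situation -> 0.0 bits
print(residual_variety(16, 4))   # under-equipped control -> 2.0 bits remain
```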

"The management of a system has to deal with the generation of the plans for the system, i. e., consideration of all of the things we have discussed, the overall goals, the environment, the utilization of resources and the components. The management sets the component goals, allocates the resources, and controls the system performance." (C West Churchman, "The Systems Approach", 1968)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban dynamics", 1969)

"In self-organizing systems, on the other hand, ‘control’ of the organization is typically distributed over the whole of the system. All parts contribute evenly to the resulting arrangement." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes." (Charles Goodhart, "Problems of Monetary Management: the U.K. Experience", 1975)

"Effect spreads its 'tentacles' not only forwards (as a new cause giving rise to a new effect) but also backwards, to the cause which gave rise to it, thus modifying, exhausting or intensifying its force. This interaction of cause and effect is known as the principle of feedback. It operates everywhere, particularly in all self-organising systems where perception, storing, processing and use of information take place, as for example, in the organism, in a cybernetic device, and in society. The stability, control and progress of a system are inconceivable without feedback." (Alexander Spirkin, "Dialectical Materialism", 1983)

"Ultimately, uncontrolled escalation destroys a system. However, change in the direction of learning, adaptation, and evolution arises from the control of control, rather than unchecked change per se. In general, for the survival and co-evolution of any ecology of systems, feedback processes must be embodied by a recursive hierarchy of control circuits." (Bradford P Keeney, "Aesthetics of Change", 1983)

"The term closed-loop learning process refers to the idea that one learns by determining what is desired and comparing what is actually taking place as measured at the process and feedback for comparison. The difference between what is desired and what is taking place provides an error indication which is used to develop a signal to the process being controlled." (Harold Chestnut, 1984)
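The error-driven loop Chestnut describes can be sketched in a few lines; the gain and iteration count here are arbitrary illustrative choices:

```python
# Minimal closed-loop correction: compare desired with measured output,
# feed the error back as a corrective signal to the controlled process.

desired = 10.0   # the set point ("what is desired")
actual = 0.0     # the measured process output
gain = 0.5       # proportional feedback gain

for _ in range(20):
    error = desired - actual   # the error indication
    actual += gain * error     # signal applied to the controlled process

print(actual)  # has converged close to the desired value
```

Each pass shrinks the remaining error by a constant factor, so the process output settles onto the set point.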

"Distributed control means that the outcomes of a complex adaptive system emerge from a process of self-organization rather than being designed and controlled externally or by a centralized body." (Brenda Zimmerman et al, "A complexity science primer", 1998)

"From a more general philosophical perspective we can say that we wish to model complex systems because we want to understand them better. The main requirement for our models accordingly shifts from having to be correct to being rich in information. This does not mean that the relationship between the model and the system itself becomes less important, but the shift from control and prediction to understanding does have an effect on our approach to complexity: the evaluation of our models in terms of performance can be deferred. Once we have a better understanding of the dynamics of complexity, we can start looking for the similarities and differences between different complex systems and thereby develop a clearer understanding of the strengths and limitations of different models." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Cybernetics is the science of effective organization, of control and communication in animals and machines. It is the art of steersmanship, of regulation and stability. The concern here is with function, not construction, in providing regular and reproducible behaviour in the presence of disturbances. Here the emphasis is on families of solutions, ways of arranging matters that can apply to all forms of systems, whatever the material or design employed. [...] This science concerns the effects of inputs on outputs, but in the sense that the output state is desired to be constant or predictable – we wish the system to maintain an equilibrium state. It is applicable mostly to complex systems and to coupled systems, and uses the concepts of feedback and transformations (mappings from input to output) to effect the desired invariance or stability in the result." (Chris Lucas, "Cybernetics and Stochastic Systems", 1999)

"Chaos theory reconciles our intuitive sense of free will with the deterministic laws of nature. However, it has an even deeper philosophical ramification. Not only do we have freedom to control our actions, but also the sensitivity to initial conditions implies that even our smallest act can drastically alter the course of history, for better or for worse. Like the butterfly flapping its wings, the results of our behavior are amplified with each day that passes, eventually producing a completely different world than would have existed in our absence!" (Julien C Sprott, "Strange Attractors: Creating Patterns in Chaos", 2000)

"To develop a Control, the designer should find aspect systems, subsystems, or constraints that will prevent the negative interferences between elements (friction) and promote positive interferences (synergy). In other words, the designer should search for ways of minimizing frictions that will result in maximization of the global satisfaction." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"In chaotic deterministic systems, the probabilistic description is not linked to the number of degrees of freedom (which can be just one as for the logistic map) but stems from the intrinsic erraticism of chaotic trajectories and the exponential amplification of small uncertainties, reducing the control on the system behavior." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"Cyberneticists argue that positive feedback may be useful, but it is inherently unstable, capable of causing loss of control and runaway. A higher level of control must therefore be imposed upon any positive feedback mechanism: self-stabilising properties of a negative feedback loop constrain the explosive tendencies of positive feedback. This is the starting point of our journey to explore the role of cybernetics in the control of biological growth. That is the assumption that the evolution of self-limitation has been an absolute necessity for life forms with exponential growth." (Tony Stebbing, "A Cybernetic View of Biological Growth: The Maia Hypothesis", 2011)

"Cybernetics is the study of systems which can be mapped using loops (or more complicated looping structures) in the network defining the flow of information. Systems of automatic control will of necessity use at least one loop of information flow providing feedback." (Alan Scrivener, "A Curriculum for Cybernetics and Systems Theory", 2012)

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos", 2013)

"The problem of complexity is at the heart of mankind's inability to predict future events with any accuracy. Complexity science has demonstrated that the more factors found within a complex system, the more chances of unpredictable behavior. And without predictability, any meaningful control is nearly impossible. Obviously, this means that you cannot control what you cannot predict. The ability ever to predict long-term events is a pipedream. Mankind has little to do with changing climate; complexity does." (Lawrence K Samuels, "The Real Science Behind Changing Climate", 2014)

"Cybernetics studies the concepts of control and communication in living organisms, machines and organizations including self-organization. It focuses on how a (digital, mechanical or biological) system processes information, responds to it and changes or being changed for better functioning (including control and communication)." (Dmitry A Novikov, "Cybernetics 2.0", 2016)

More quotes on "Control" at the-web-of-knowledge.blogspot.com.
