
31 December 2025

🕸Systems Engineering: Limits (Just the Quotes)

"Every situation is an equilibrium of forces; every life is a struggle between opposing forces working within the limits of a certain equilibrium." (Henri-Frédéric Amiel, "Amiel's Journal", 1885)

"By some definitions 'systems engineering' is suggested to be a new discovery. Actually it is a common engineering approach which has taken on a new and important meaning because of the greater complexity and scope of problems to be solved in industry, business, and the military. Newly discovered scientific phenomena, new machines and equipment, greater speed of communications, increased production capacity, the demand for control over ever-extending areas under constantly changing conditions, and the resultant complex interactions, all have created a tremendously accelerating need for improved systems engineering. Systems engineering can be complex, but is simply defined as 'logical engineering within physical, economic and technical limits' - bridging the gap from fundamental laws to a practical operating system." (Instrumentation Technology, 1957)

"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)

"Taking no action to solve these problems is equivalent of taking strong action. Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth. A decision to do nothing is a decision to increase the risk of collapse." (Donella Meadows et al, "The Limits to Growth", 1972) 

"Technology can relieve the symptoms of a problem without affecting the underlying causes. Faith in technology as the ultimate solution to all problems can thus divert our attention from the most fundamental problem - the problem of growth in a finite system." (Donella A Meadows, "The Limits to Growth", 1972)

"Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth." (Mihajlo D Mesarovic, "Mankind at the Turning Point", 1974)

"In a loosely coupled system there is more room available for self-determination by the actors. If it is argued that a sense of efficacy is crucial for human beings. when a sense of efficacy might be greater in a loosely coupled system with autonomous units than it would be in a tightly coupled system where discretion is limited." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"The greater the uncertainty, the greater the amount of decision making and information processing. It is hypothesized that organizations have limited capacities to process information and adopt different organizing modes to deal with task uncertainty. Therefore, variations in organizing modes are actually variations in the capacity of organizations to process information and make decisions about events which cannot be anticipated in advance." (John K Galbraith, "Organization Design", 1977)

"Prediction of the future is possible only in systems that have stable parameters like celestial mechanics. The only reason why prediction is so successful in celestial mechanics is that the evolution of the solar system has ground to a halt in what is essentially a dynamic equilibrium with stable parameters. Evolutionary systems, however, by their very nature have unstable parameters. They are disequilibrium systems and in such systems our power of prediction, though not zero, is very limited because of the unpredictability of the parameters themselves. If, of course, it were possible to predict the change in the parameters, then there would be other parameters which were unchanged, but the search for ultimately stable parameters in evolutionary systems is futile, for they probably do not exist… Social systems have Heisenberg principles all over the place, for we cannot predict the future without changing it." (Kenneth E Boulding, Evolutionary Economics, 1981)

"Prediction of the future is possible only in systems that have stable parameters like celestial mechanics. The only reason why prediction is so successful in celestial mechanics is that the evolution of the solar system has ground to a halt in what is essentially a dynamic equilibrium with stable parameters. Evolutionary systems, however, by their very nature have unstable parameters. They are disequilibrium systems and in such systems our power of prediction, though not zero, is very limited because of the unpredictability of the parameters themselves. If, of course, it were possible to predict the change in the parameters, then there would be other parameters which were unchanged, but the search for ultimately stable parameters in evolutionary systems is futile, for they probably do not exist… Social systems have Heisenberg principles all over the place, for we cannot predict the future without changing it." (Kenneth E Boulding, "Evolutionary Economics", 1981)

"Cellular automata are discrete dynamical systems with simple construction but complex self-organizing behaviour. Evidence is presented that all one-dimensional cellular automata fall into four distinct universality classes. Characterizations of the structures generated in these classes are discussed. Three classes exhibit behaviour analogous to limit points, limit cycles and chaotic attractors. The fourth class is probably capable of universal computation, so that properties of its infinite time behaviour are undecidable." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"Regarding stability, the state trajectories of a system tend to equilibrium. In the simplest case they converge to one point (or different points from different initial states), more commonly to one" (or several, according to initial state) fixed point or limit cycle(s) or even torus(es) of characteristic equilibrial behaviour. All this is, in a rigorous sense, contingent upon describing a potential, as a special summation of the multitude of forces acting upon the state in question, and finding the fixed points, cycles, etc., to be minima of the potential function. It is often more convenient to use the equivalent jargon of 'attractors' so that the state of a system is 'attracted' to an equilibrial behaviour. In any case, once in equilibrial conditions, the system returns to its limit, equilibrial behaviour after small, arbitrary, and random perturbations." (Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Systems, acting dynamically, produce (and incidentally, reproduce) their own boundaries, as structures which are complementary (necessarily so) to their motion and dynamics. They are liable, for all that, to instabilities chaos, as commonly interpreted of chaotic form, where nowadays, is remote from the random. Chaos is a peculiar situation in which the trajectories of a system, taken in the traditional sense, fail to converge as they approach their limit cycles or 'attractors' or 'equilibria'. Instead, they diverge, due to an increase, of indefinite magnitude, in amplification or gain." (Gordon Pask, "Different Kinds of Cybernetics", 1992)

"As with subtle bifurcations, catastrophes also involve a control parameter. When the value of that parameter is below a bifurcation point, the system is dominated by one attractor. When the value of that parameter is above the bifurcation point, another attractor dominates. Thus the fundamental characteristic of a catastrophe is the sudden disappearance of one attractor and its basin, combined with the dominant emergence of another attractor. Any type of attractor static, periodic, or chaotic can be involved in this. Elementary catastrophe theory involves static attractors, such as points. Because multidimensional surfaces can also attract (together with attracting points on these surfaces), we refer to them more generally as attracting hypersurfaces, limit sets, or simply attractors." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"In spite of the insurmountable computational limits, we continue to pursue the many problems that possess the characteristics of organized complexity. These problems are too important for our well being to give up on them. The main challenge in pursuing these problems narrows down fundamentally to one question: how to deal with systems and associated problems whose complexities are beyond our information processing limits? That is, how can we deal with these problems if no computational power alone is sufficient? " (George Klir, "Fuzzy sets and fuzzy logic", 1995)

"System dynamics models are not derived statistically from time-series data. Instead, they are statements about system structure and the policies that guide decisions. Models contain the assumptions being made about a system. A model is only as good as the expertise which lies behind its formulation. A good computer model is distinguished from a poor one by the degree to which it captures the essence of a system that it represents. Many other kinds of mathematical models are limited because they will not accept the multiple-feedback-loop and nonlinear nature of real systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"The dimensionality and nonlinearity requirements of chaos do not guarantee its appearance. At best, these conditions allow it to occur, and even then under limited conditions relating to particular parameter values. But this does not imply that chaos is rare in the real world. Indeed, discoveries are being made constantly of either the clearly identifiable or arguably persuasive appearance of chaos. Most of these discoveries are being made with regard to physical systems, but the lack of similar discoveries involving human behavior is almost certainly due to the still developing nature of nonlinear analyses in the social sciences rather than the absence of chaos in the human setting. " (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"The only organization capable of unprejudiced growth, or unguided learning, is a network. All other topologies limit what can happen." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"A standalone object, no matter how well designed, has limited potential for new weirdness. A connected object, one that is a node in a network that interacts in some way with other nodes, can give birth to a hundred unique relationships that it never could do while unconnected. Out of this tangle of possible links come myriad new niches for innovations and interactions." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"At present, there is far more to be gained by pushing the boundaries of what can be done by the bottom than by focusing on what can be done at the top. When it comes to control, there is plenty of room at the bottom. What we are discovering is that peer-based networks with millions of parts, minimal oversight, and maximum connection among them can do far more than anyone ever expected. We don’t yet know what the limits of decentralization are." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Don’t solve problems; pursue opportunities. […] In both the short and long term, our ability to solve social and economic problems will be limited primarily to our lack of imagination in seizing opportunities, rather than trying to optimize solutions. There is more to be gained by producing more opportunities than by optimizing existing ones." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998) 

"Faced with the overwhelming complexity of the real world, time pressure, and limited cognitive capabilities, we are forced to fall back on rote procedures, habits, rules of thumb, and simple mental models to make decisions. Though we sometimes strive to make the best decisions we can, bounded rationality means we often systematically fall short, limiting our ability to learn from experience." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Changing measures are a particularly common problem with comparisons over time, but measures also can cause problems of their own. [...] We cannot talk about change without making comparisons over time. We cannot avoid such comparisons, nor should we want to. However, there are several basic problems that can affect statistics about change. It is important to consider the problems posed by changing - and sometimes unchanging - measures, and it is also important to recognize the limits of predictions. Claims about change deserve critical inspection; we need to ask ourselves whether apples are being compared to apples - or to very different objects." (Joel Best, "Damned Lies and Statistics: Untangling Numbers from the Media, Politicians, and Activists", 2001)

"Limiting factors in population dynamics play the role in ecology that friction does in physics. They stop exponential growth, not unlike the way in which friction stops uniform motion. Whether or not ecology is more like physics in a viscous liquid, when the growth-rate-based traditional view is sufficient, is an open question. We argue that this limit is an oversimplification, that populations do exhibit inertial properties that are noticeable. Note that the inclusion of inertia is a generalization—it does not exclude the regular rate-based, first-order theories. They may still be widely applicable under a strong immediate density dependence, acting like friction in physics." (Lev Ginzburg & Mark Colyvan, "Ecological Orbits: How Planets Move and Populations Grow", 2004)

"It is science that brings us an understanding of the true complexity of natural systems. The insights from the science of ecology are teaching us how to work with the checks and balances of nature, and encouraging a new, rational, limited-input, environmentally sound means of vineyard management that offers a third way between the ideologically driven approach of Biodynamics and conventional chemical-based agricultural systems." (Jamie Goode," The Science of Wine: From Vine to Glass", 2005)

"A great deal of the results in many areas of physics are presented in the form of conservation laws, stating that some quantities do not change during evolution of the system. However, the formulations in cybernetical physics are different. Since the results in cybernetical physics establish how the evolution of the system can be changed by control, they should be formulated as transformation laws, specifying the classes of changes in the evolution of the system attainable by control function from the given class, i.e., specifying the limits of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"Humans have difficulty perceiving variables accurately […]. However, in general, they tend to have inaccurate perceptions of system states, including past, current, and future states. This is due, in part, to limited ‘mental models’ of the phenomena of interest in terms of both how things work and how to influence things. Consequently, people have difficulty determining the full implications of what is known, as well as considering future contingencies for potential systems states and the long-term value of addressing these contingencies." (William B. Rouse, "People and Organizations: Explorations of Human-Centered Design", 2007)

"The methodology of feedback design is borrowed from cybernetics (control theory). It is based upon methods of controlled system model’s building, methods of system states and parameters estimation (identification), and methods of feedback synthesis. The models of controlled system used in cybernetics differ from conventional models of physics and mechanics in that they have explicitly specified inputs and outputs. Unlike conventional physics results, often formulated as conservation laws, the results of cybernetical physics are formulated in the form of transformation laws, establishing the possibilities and limits of changing properties of a physical system by means of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"[…] our mental models fail to take into account the complications of the real world - at least those ways that one can see from a systems perspective. It is a warning list. Here is where hidden snags lie. You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time." (Donella Meadows, "Thinking in systems: A Primer", 2008)

"The other element of systems thinking is learning to influence the system with reinforcing feedback as an engine for growth or decline. [...] Without this kind of understanding, managers will hit blockages in the form of seeming limits to growth and resistance to change because the large complex system will appear impossible to manage. Systems thinking is a significant solution." (Richard L Daft, "The Leadership Experience" 4th Ed., 2008)

"This new model of development would be based clearly on the goal of sustainable human well-being. It would use measures of progress that clearly acknowledge this goal. It would acknowledge the importance of ecological sustainability, social fairness, and real economic efficiency. Ecological sustainability implies recognizing that natural and social capital are not infinitely substitutable for built and human capital, and that real biophysical limits exist to the expansion of the market economy." (Robert Costanza, "Toward a New Sustainable Economy", 2008)

"You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays." (Donella H Meadow, "Thinking in Systems: A Primer", 2008)

"A model is a representation in that it (or its properties) is chosen to stand for some other entity (or its properties), known as the target system. A model is a tool in that it is used in the service of particular goals or purposes; typically these purposes involve answering some limited range of questions about the target system." (Wendy S Parker, "Confirmation and Adequacy-for-Purpose in Climate Modelling", Proceedings of the Aristotelian Society, Supplementary Volumes, Vol. 83, 2009)

"Strange attractors, unlike regular ones, are geometrically very complicated, as revealed by the evolution of a small phase-space volume. For instance, if the attractor is a limit cycle, a small two-dimensional volume does not change too much its shape: in a direction it maintains its size, while in the other it shrinks till becoming a 'very thin strand' with an almost constant length. In chaotic systems, instead, the dynamics continuously stretches and folds an initial small volume transforming it into a thinner and thinner 'ribbon' with an exponentially increasing length." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"The first path of increasing complexity via innovation often faces limits as to how much complexity can be added or reduced in a given system. This is because if you change the complexity level in one place, a compensating change in the opposite direction generally occurs somewhere else." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"Complexity scientists concluded that there are just too many factors - both concordant and contrarian - to understand. And with so many potential gaps in information, almost nobody can see the whole picture. Complex systems have severe limits, not only to predictability but also to measurability. Some complexity theorists argue that modelling, while useful for thinking and for studying the complexities of the world, is a particularly poor tool for predicting what will happen." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

01 December 2025

🕸Systems Engineering: Policies (Just the Quotes)

"[System dynamics] is an approach that should help in important top-management problems [...] The solutions to small problems yield small rewards. Very often the most important problems are but little more difficult to handle than the unimportant. Many [people] predetermine mediocre results by setting initial goals too low. The attitude must be one of enterprise design. The expectation should be for major improvement [...] The attitude that the goal is to explain behavior; which is fairly common in academic circles, is not sufficient. The goal should be to find management policies and organizational structures that lead to greater success." (Jay W Forrester, "Industrial Dynamics", 1961)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states" (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban dynamics", 1969)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states" (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay Wright Forrester, "Urban dynamics", 1969)

"A model for simulating dynamic system behavior requires formal policy descriptions to specify how individual decisions are to be made. Flows of information are continuously converted into decisions and actions. No plea about the inadequacy of our understanding of the decision-making processes can excuse us from estimating decision-making criteria. To omit a decision point is to deny its presence - a mistake of far greater magnitude than any errors in our best estimate of the process." (Jay W Forrester,Policies, decisions and information sources for modeling", 1994)

"First, social systems are inherently insensitive to most policy changes that people choose in an effort to alter the behavior of systems. In fact, social systems draw attention to the very points at which an attempt to intervene will fail. Human intuition develops from exposure to simple systems. In simple systems, the cause of a trouble is close in both time and space to symptoms of the trouble. If one touches a hot stove, the burn occurs here and now; the cause is obvious. However, in complex dynamic systems, causes are often far removed in both time and space from the symptoms. True causes may lie far back in time and arise from an entirely different part of the system from when and where the symptoms occur. However, the complex system can mislead in devious ways by presenting an apparent cause that meets the expectations derived from simple systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"Second, social systems seem to have a few sensitive influence points through which behavior can be changed. These high-influence points are not where most people expect. Furthermore, when a high-influence policy is identified, the chances are great that a person guided by intuition and judgment will alter the system in the wrong direction." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"System dynamics models are not derived statistically from time-series data. Instead, they are statements about system structure and the policies that guide decisions. Models contain the assumptions being made about a system. A model is only as good as the expertise which lies behind its formulation. A good computer model is distinguished from a poor one by the degree to which it captures the essence of a system that it represents. Many other kinds of mathematical models are limited because they will not accept the multiple-feedback-loop and nonlinear nature of real systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"Third, social systems exhibit a conflict between short-term and long-term consequences of a policy change. A policy that produces improvement in the short run is usually one that degrades a system in the long run. Likewise, policies that produce long-run improvement may initially depress behavior of a system. This is especially treacherous. The short run is more visible and more compelling. Short-run pressures speak loudly for immediate attention. However, sequences of actions all aimed at short-run improvement can eventually burden a system with long-run depressants so severe that even heroic short-run measures no longer suffice. Many problems being faced today are the cumulative result of short-run measures taken in prior decades." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"[...] information feedback about the real world not only alters our decisions within the context of existing frames and decision rules but also feeds back to alter our mental models. As our mental models change we change the structure of our systems, creating different decision rules and new strategies. The same information, processed and interpreted by a different decision rule, now yields a different decision. Altering the structure of our systems then alters their patterns of behavior. The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view and then redesign our policies and institutions accordingly." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"To avoid policy resistance and find high leverage policies requires us to expand the boundaries of our mental models so that we become aware of and understand the implications of the feedbacks created by the decisions we make. That is, we must learn about the structure and dynamics of the increasingly complex systems in which we are embedded." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"Deep change in mental models, or double-loop learning, arises when evidence not only alters our decisions within the context of existing frames, but also feeds back to alter our mental models. As our mental models change, we change the structure of our systems, creating different decision rules and new strategies. The same information, interpreted by a different model, now yields a different decision. Systems thinking is an iterative learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view, reinventing our policies and institutions accordingly." (John D Sterman, "Learning in and about complex systems", Systems Thinking Vol. 3 2003)

"System dynamics is an approach to understanding the behaviour of over time. It deals with internal feedback loops and time delays that affect the behaviour of the entire system. It also helps the decision maker untangle the complexity of the connections between various policy variables by providing a new language and set of tools to describe. Then it does this by modeling the cause and effect relationships among these variables." (Raed M Al-Qirem & Saad G Yaseen, "Modelling a Small Firm in Jordan Using System Dynamics", 2010)

"Complex systems defy intuitive solutions. Even a third-order, linear differential equation is unsolvable by inspection. Yet, important situations in management, economics, medicine, and social behavior usually lose reality if simplified to less than fifth-order nonlinear dynamic systems. Attempts to deal with nonlinear dynamic systems using ordinary processes of description and debate lead to internal inconsistencies. Underlying assumptions may have been left unclear and contradictory, and mental models are often logically incomplete. Resulting behavior is likely to be contrary to that implied by the assumptions being made about' underlying system structure and governing policies." (Jay W. Forrester, "Modeling for What Purpose?", The Systems Thinker Vol. 24" (2), 2013)

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly - effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

"The work around the complex systems map supported a concentration on causal mechanisms. This enabled poor system responses to be diagnosed as the unanticipated effects of previous policies as well as identification of the drivers of the sector. Understanding the feedback mechanisms in play then allowed experimentation with possible future policies and the creation of a coherent and mutually supporting package of recommendations for change. " (David C Lane et al, "Blending systems thinking approaches for organisational analysis: reviewing child protection", 2015)

12 March 2024

🕸Systems Engineering: A Play of Problems (Much Ado about Nothing)

Disclaimer: This post was created just for fun. No problem was hurt or solved in the process! 
Updated: 12-Jun-2024

On Problems

Everybody has at least a problem. If somebody doesn’t have a problem, he’ll make one. If somebody can't make a problem, he can always find one. One doesn't need to search long to find a problem. Looking for a problem, one sees more problems.

Not having a problem can easily become a problem. It’s better to have a problem than none. The 'none' problem is undefinable, which makes it a problem.

Avoiding a problem might lead you to another problem. Some problems are so old that it's easier to ignore them.

In every big problem there’s a small problem trying to come out. Most problems can be reduced to smaller problems. A small problem may hide a bigger problem. 

It’s better to solve a problem while it is still small; however, problems can be perceived only when they grow bigger (big enough).

In the neighborhood of a problem there’s another problem getting closer. Problems tend to attract each other. 

Between two problems there’s enough place for a third to appear. The shortest path between two problems is another problem. 

Two problems that appear together in successive situations might be the parts of the same problem. 

A problem is more than the sum of its parts.

Any problem can be simplified to the degree that it becomes another problem. 

The complement of a problem is another problem. At the intersection/union of two problems lies another problem.

The inverse of a problem is another problem more complex than the initial problem.

Defining a problem correctly is another problem. A known problem doesn’t make one problem less. 

When a problem seems to be enough, a second appears. A problem never comes alone.  The interplay of the two problems creates a third.

Sharing the problems with somebody else just multiplies the number of problems. 

Problems multiply beyond necessity. Problems multiply beyond our expectations. Problems multiply faster than we can solve them. 

Having more than one problem is for many already too much. Between many big problems and an infinity of problems there seems to be no big difference.

Many small problems can converge toward a bigger problem. Many small problems can also diverge toward two bigger problems. 

When neighboring problems exist, people tend to isolate them. Isolated problems tend to find other ways to surprise.

Several problems can aggregate and create bigger problems that tend to suck in the neighboring problems.

If one waits long enough, some problems will solve themselves or they will get bigger. Bigger problems exceed one's area of responsibility.

One can get credit for a self-created problem. It takes only a good problem to become famous.

A good problem can provide a lifetime of work. A good problem has the tendency to kick back where it hurts the most. One can fall in love with a good problem.

One should not theorize before one has a (good) problem. A problem can lead to a new theory, while a theory brings with it many more problems. 

If the only tool you have is a hammer, every problem will look like a nail. (paraphrasing Abraham H Maslow)

Any field of knowledge can be covered by a set of problems. A field of knowledge should be learned by the problems it poses.

A problem thoroughly understood is always fairly simple, but unfairly complex. (paraphrasing Charles F Kettering)

The problem solver usually created the problem.

Problem Solving

Break a problem in two to solve it more easily. Finding how to break a problem is already another problem. Deconstructing a problem into its parts is no guarantee of solving it.

Every problem has at least two solutions from which at least one is wrong. It’s easier to solve the wrong problem. 

It’s easier to solve a problem if one knows the solution already. Knowing a solution is not a guarantee for solving the problem.

Sometimes a problem disappears faster than one can find a solution. 

If a problem has two solutions, more likely a third solution exists. 

Solutions can be used to generate problems. The design of a problem seldom lies in its solutions. 

The solution of a problem can create at least one more problem. 

One can solve only one problem at a time. 

Unsolvable problems lead to problematic approximations. There's always a better approximation; one just needs to find it. One needs to know when to stop searching for an approximation.

There's more than one way of solving a problem. Finding another way of solving it provides more insight into the problem. More insight complicates the problem unnecessarily.

Solving a problem is a matter of perspective. Finding the right perspective is another problem.

Solving a problem is a matter of tools. Searching for the right tool can be a laborious process. 

Solving a problem requires a higher level of consciousness than the level that created it (see Einstein). With the increasing complexity of the problems, one can run out of consciousness.

Trying to solve an old problem creates resistance against its solution(s). 

The premature optimization of a problem is the root of all evil. (paraphrasing Donald Knuth)

A great discovery solves a great problem but creates a few others on its way. (paraphrasing George Polya)

Solving the symptoms of a problem can prove more difficult than solving the problem itself.

A master is a person who knows the solutions to his problems. To learn the solutions to others' problems he needs a pupil. 

"The final test of a theory is its capacity to solve the problems which originated it." (George Dantzig) It's easier to theorize if one has a set of problems.

A problem is defined as a gap between where you are and where you want to be, though nobody knows exactly where he is or wants to be.

Complex problems are the problems that persist - so are minor ones.

"The problems are solved, not by giving new information, but by arranging what we have known since long." (Ludwig Wittgenstein, 1953) Some people are just lost in rearranging. 

Solving problems is a practical skill, but impractical endeavor. (paraphrasing George Polya) 

"To ask the right question is harder than to answer it." (Georg Cantor) So most people avoid asking the right question.

Solve more problems than you create.

They Said It

"A great many problems do not have accurate answers, but do have approximate answers, from which sensible decisions can be made." (Berkeley's Law)

"A problem is an opportunity to grow, creating more problems. [...] most important problems cannot be solved; they must be outgrown." (Wayne Dyer)

"A system represents someone's solution to a problem. The system doesn't solve the problem." (John Gall, 1975)

"As long as a branch of science offers an abundance of problems, so long is it alive." (David Hilbert)

"Complex problems have simple, easy to understand, wrong answers." [Grossman's Misquote]

"Every solution breeds new problems." [Murphy's laws]

"Given any problem containing n equations, there will be n+1 unknowns." [Snafu]

"I have not seen any problem, however complicated, which, when you looked at it in the right way, did not become still more complicated." (Paul Anderson)

"If a problem causes many meetings, the meetings eventually become more important than the problem." (Hendrickson’s Law)

"If you think the problem is bad now, just wait until we’ve solved it." (Arthur Kasspe) [Epstein’s Law]

"Inventing is easy for staff outfits. Stating a problem is much harder. Instead of stating problems, people like to pass out half- accurate statements together with half-available solutions which they can't finish and which they want you to finish." [Katz's Maxims]

"It is better to do the right problem the wrong way than to do the wrong problem the right way." (Richard Hamming)

"Most problems have either many answers or no answer. Only a few problems have a single answer." [Berkeley's Law]

"Problems worthy of attack prove their worth by fighting back." (Piet Hein)

Rule of Accuracy: "When working toward the solution of a problem, it always helps if you know the answer."
Corollary: "Provided, of course, that you know there is a problem."

"Some problems are just too complicated for rational logical solutions. They admit of insights, not answers." (Jerome B Wiesner, 1963)

"Sometimes, where a complex problem can be illuminated by many tools, one can be forgiven for applying the one he knows best." [Screwdriver Syndrome]

"The best way to escape from a problem is to solve it." (Brendan Francis)

"The chief cause of problems is solutions." [Sevareid's Law]

"The first step of problem solving is to understand the existing conditions." (Kaoru Ishikawa)

"The human race never solves any of its problems, it only outlives them." (David Gerrold)

"The most fruitful research grows out of practical problems."  (Ralph B Peck)

"The problem-solving process will always break down at the point at which it is possible to determine who caused the problem." [Fyffe's Axiom]

"The worst thing you can do to a problem is solve it completely." (Daniel Kleitman)

"The easiest way to solve a problem is to deny it exists." (Isaac Asimov)

"The solution to a problem changes the problem." [Peers's Law]

"There is a solution to every problem; the only difficulty is finding it." [Evvie Nef's Law]

"There is no mechanical problem so difficult that it cannot be solved by brute strength and ignorance. [William's Law]

"Today's problems come from yesterday’s 'solutions'." (Peter M Senge, 1990)

"While the difficulties and dangers of problems tend to increase at a geometric rate, the knowledge and manpower qualified to deal with these problems tend to increase linearly." [Dror's First Law]

"You are never sure whether or not a problem is good unless you actually solve it." (Mikhail Gromov)

Previous Post <<||>> Next Post

More quotes on Problem solving at QuotableMath.blogpost.com

Resources:
Murphy's laws and corollaries (link)

02 January 2024

🕸Systems Engineering: Never-Ending Stories in Praxis (Quote of the Day)

[Image: Systems Engineering Cycle]

"[…] the longer one works on […] a project without actually concluding it, the more remote the expected completion date becomes. Is this really such a perplexing paradox? No, on the contrary: human experience, all-too-familiar human experience, suggests that in fact many tasks suffer from similar runaway completion times. In short, such jobs either get done soon or they never get done. It is surprising, though, that this common conundrum can be modeled so simply by a self-similar power law." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

I found the above quote while browsing through Manfred Schroeder's book on fractals, chaos and power laws, a book that also explores related topics like percolation, recursion, randomness, self-similarity and determinism. Unfortunately, once one goes beyond the introductory notes of each chapter, the subjects require more advanced knowledge of mathematics, as well as further analysis and exploration of the underlying models. Despite this, the book is still an interesting read with ideas to ponder upon.

I found myself a few times in the situation described above - working on a task that didn't seem to end, despite investing more effort and approaching the solution from different angles. The reasons behind such situations were multiple and typically lay beyond my direct area of influence and/or decision. In a systemic setup, there are parts of a system that find themselves in opposition, with different forces pulling in distinct directions. It can be a matter of interests, goals, expectations or solutions which compete or become subject to politics.

For example, in Data Analytics or Data Science there are high chances that no progress can be made beyond a certain point without first addressing the quality of the data or design/architectural issues. Integrations between applications, data migrations and other solutions which rely heavily on data are sensitive to data quality and to the architecture's reliability. As long as the source of variability (data, data generators) is not stabilized, providing a stable solution has little chance of success, no matter how much effort is invested or how performant the tools are.

Some of the issues can be solved by allocating resources to handle their implications. Unfortunately, some organizations attempt to solve such issues by allocating resources in the wrong areas or by addressing the symptoms, instead of taking a step back, looking systemically at the problem, and analyzing and modeling it accordingly. Moreover, there are organizations which refuse to recognize that they have a problem at all! In the blame game, it's much easier to shift the responsibility onto somebody else's shoulders.

Defining the right problem to solve might prove more challenging than expected, and usually this requires several iterations in which the knowledge obtained along the way is incorporated gradually. Other times, one attempts to solve the correct problem by using the wrong methodology, architecture and/or skillset. The difference between right and wrong depends on the context, and even between similar problems the context can make a considerable difference.

The above quote can be corroborated with situations in which perfection is demanded. In IT and management setups, excellence is often confused with perfection; the latter is impossible to achieve, though many managers take it as the norm. There's a critical point above which the effort invested outweighs the solution's plausibility by an exponential factor.

Another source of unending effort is when requirements change frequently and swiftly - e.g. when the rate at which changes occur outweighs the progress made toward a solution. Unless the requirements are stabilized, the effort spirals outward (in an exponential manner).

Finally, there are extreme cases in which, for example, the complexity of the task outweighs the available skillset and/or number of resources. Moreover, there are problems which admit plausible solutions, though there are also problems (especially systemic ones) which don't have stable or plausible solutions.

Behind most such cases lie factors that tend toward chaotic behavior, which occurs especially when the environment is far from favorable. The models used to depict such relations are nonlinear, sometimes expressed as power laws - one quantity varying as a power of another, with the variation increasing with each generation.
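To make the last point a bit more concrete, below is a minimal sketch of my own (it is not taken from Schroeder's book): it draws task durations from a Pareto (power-law) distribution and shows that the typical remaining time grows with the time already spent - the 'runaway completion time' effect from the opening quote. The exponent ALPHA, the minimum duration T_MIN and the sample size N are arbitrary assumptions chosen purely for illustration.

```python
# Minimal illustration (own sketch, not from Schroeder's book): if task durations T
# follow a power law, P(T > t) = (T_MIN / t)^ALPHA, then the longer a task has
# already run, the further away its typical completion date moves.
import random
import statistics

random.seed(42)
ALPHA = 2.0      # assumed tail exponent of the power law
T_MIN = 1.0      # assumed minimum task duration (arbitrary time units)
N = 500_000      # number of simulated tasks

# Inverse-transform sampling of a Pareto(ALPHA, T_MIN) random variable.
durations = [T_MIN / (1.0 - random.random()) ** (1.0 / ALPHA) for _ in range(N)]

for elapsed in (1, 2, 5, 10):
    remaining = [d - elapsed for d in durations if d > elapsed]
    print(f"after {elapsed:>2} time units: {len(remaining):>6} tasks still open, "
          f"median remaining time ~ {statistics.median(remaining):.1f}")
# The median remaining time grows roughly in proportion to the time already spent;
# under an exponential (memoryless) model it would stay constant instead.
```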

Previous Post <<||>> Next Post

Resources:
[1] Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990 (quotes)

31 December 2014

🕸Systems Engineering: Idealization (Just the Quotes)

"It is difficult, however, to learn all these things from situations such as occur in everyday life. What we need is a series of abstract and quite impersonal situations to argue about in which one side is surely right and the other surely wrong. The best source of such situations for our purposes is geometry. Consequently we shall study geometric situations in order to get practice in straight thinking and logical argument, and in order to see how it is possible to arrange all the ideas associated with a given subject in a coherent, logical system that is free from contradictions. That is, we shall regard the proof of each proposition of geometry as an example of correct method in argumentation, and shall come to regard geometry as our ideal of an abstract logical system. Later, when we have acquired some skill in abstract reasoning, we shall try to see how much of this skill we can apply to problems from real life." (George D Birkhoff & Ralph Beately, "Basic Geometry", 1940)

"A material model is the representation of a complex system by a system which is assumed simpler and which is also assumed to have some properties similar to those selected for study in the original complex system. A formal model is a symbolic assertion in logical terms of an idealised relatively simple situation sharing the structural properties of the original factual system." (Arturo Rosenblueth & Norbert Wiener, "The Role of Models in Science", Philosophy of Science Vol. 12 (4), 1945)

"In fact, it is empirically ascertainable that every event is actually produced by a number of factors, or is at least accompanied by numerous other events that are somehow connected with it, so that the singling out involved in the picture of the causal chain is an extreme abstraction. Just as ideal objects cannot be isolated from their proper context, material existents exhibit multiple interconnections; therefore the universe is not a heap of things but a system of interacting systems." (Mario Bunge, "Causality: The place of the casual principles in modern science", 1959)

"There is a logic of language and a logic of mathematics. The former is supple and lifelike, it follows our experience. The latter is abstract and rigid, more ideal. The latter is perfectly necessary, perfectly reliable: the former is only sometimes reliable and hardly ever systematic. But the logic of mathematics achieves necessity at the expense of living truth, it is less real than the other, although more certain. It achieves certainty by a flight from the concrete into abstraction." (Thomas Merton, "The Secular Journal of Thomas Merton", 1959)

"[…] if a system is sufficiently complicated, the time it takes to return near a state already visited is huge (think of the hundred fleas on the checkerboard). Therefore if you look at the system for a moderate amount of time, eternal return is irrelevant, and you had better choose another idealization." (David Ruelle, "Chance and Chaos", 1991)

"[…] it does not seem helpful just to say that all models are wrong. The very word model implies simplification and idealization. The idea that complex physical, biological or sociological systems can be exactly described by a few formulae is patently absurd. The construction of idealized representations that capture important stable aspects of such systems is, however, a vital part of general scientific analysis and statistical models, especially substantive ones, do not seem essentially different from other kinds of model." (Sir David Cox, "Comment on ‘Model uncertainty, data mining and statistical inference’", Journal of the Royal Statistical Society, Series A 158, 1995)

"Formulation of a mathematical model is the first step in the process of analyzing the behaviour of any real system. However, to produce a useful model, one must first adopt a set of simplifying assumptions which have to be relevant in relation to the physical features of the system to be modelled and to the specific information one is interested in. Thus, the aim of modelling is to produce an idealized description of reality, which is both expressible in a tractable mathematical form and sufficiently close to reality as far as the physical mechanisms of interest are concerned." (Francois Axisa, "Discrete Systems" Vol. I, 2001)

"A first important remark is that nature gives us mathematical hints. […] A second important remark is that mathematical physics deals with idealized systems. […] The third important remark is that nature may hint at a theorem but does not state clearly under which conditions is true." (David Ruelle, "The Mathematician's Brain", 2007)

"Cellular automata (CA) are idealizations of physical systems in which both space and time are assumed to be discrete and each of the interacting units can have only a finite number of discrete states." (Andreas Schadschneider et al, "Vehicular Traffic II: The Nagel–Schreckenberg Model", 2011)

"Abstract formulations of simply stated concrete ideas are often the result of efforts to create idealized models of complex systems. The models are 'idealized' in the sense that they retain only the most fundamental properties of the original systems. The vocabulary is chosen to be as inclusive as possible so that research into the model reveals facts about a wide variety of similar systems. Unfortunately, it is often the case that over time the connection between a model and the systems on which it was based is lost, and the interested reader is faced with something that looks as if it were created to be deliberately complicated - deliberately confusing - but the original intention was just the opposite. Often, the model was devised to be simpler and more transparent than any of the systems on which it was based." (John Tabak, "Beyond Geometry: A new mathematics of space and form", 2011)

"Stated loosely, models are simplified, idealized and approximate representations of the structure, mechanism and behavior of real-world systems. From the standpoint of set-theoretic model theory, a mathematical model of a target system is specified by a nonempty set - called the model’s domain, endowed with some operations and relations, delineated by suitable axioms and intended empirical interpretation." (Zoltan Domotor, "Mathematical Models in Philosophy of Science" [Mathematics of Complexity and Dynamical Systems, 2012])

30 December 2014

🕸Systems Engineering: Information Theory (Just the Quotes)

"[…] information theory is characterised essentially by its dealing always with a set of possibilities; both its primary data and its final statements are almost always about the set as such, and not about some individual element in the set." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"The general notion in communication theory is that of information. In many cases, the flow of information corresponds to a flow of energy, e. g. if light waves emitted by some objects reach the eye or a photoelectric cell, elicit some reaction of the organism or some machinery, and thus convey information." (Ludwig von Bertalanffy, "General System Theory", 1968) 

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'noise' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"The field of 'information theory' began by using the old hardware paradigm of transportation of data from point to point." (Marshall McLuhan & Eric McLuhan, Laws of Media: The New Science, 1988)

"Without an understanding of causality there can be no theory of communication. What passes as information theory today is not communication at all, but merely transportation." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"If quantum communication and quantum computation are to flourish, a new information theory will have to be developed." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

29 December 2014

🕸Systems Engineering: Cognitive Maps (Just the Quotes)

"[…] learning consists not in stimulus-response connections but in the building up in the nervous system of sets which function like cognitive maps […] such cognitive maps may be usefully characterized as varying from a narrow strip variety to a broader comprehensive variety." (Edward C Tolman, "Cognitive maps in rats and men", 1948)

"A person is changed by the contingencies of reinforcement under which he behaves; he does not store the contingencies. In particular, he does not store copies of the stimuli which have played a part in the contingencies. There are no 'iconic representations' in his mind; there are no 'data structures stored in his memory'; he has no 'cognitive map' of the world in which he has lived. He has simply been changed in such a way that stimuli now control particular kinds of perceptual behavior." (Burrhus F Skinner, "About behaviorism", 1974)

"A cognitive map is a specific way of representing a person's assertions about some limited domain, such as a policy problem. It is designed to capture the structure of the person's causal assertions and to generate the consequences that follow front this structure. […]  a person might use his cognitive map to derive explanations of the past, make predictions for the future, and choose policies in the present." (Robert M Axelrod, "Structure of Decision: The cognitive maps of political elites", 1976)

"The concepts a person uses are represented as points, and the causal links between these concepts are represented as arrows between these points. This gives a pictorial representation of the causal assertions of a person as a graph of points and arrows. This kind of representation of assertions as a graph will be called a cognitive map. The policy alternatives, all of the various causes and effects, the goals, and the ultimate utility of the decision maker can all be thought of as concept variables, and represented as points in the cognitive map. The real power of this approach ap pears when a cognitive map is pictured in graph form; it is then relatively easy to see how each of the concepts and causal relation ships relate to each other, and to see the overall structure of the whole set of portrayed assertions." (Robert Axelrod, "The Cognitive Mapping Approach to Decision Making" [in "Structure of Decision: The Cognitive Maps of Political Elites"], 1976)

"The cognitive map is not a picture or image which 'looks like' what it represents; rather, it is an information structure from which map-like images can be reconstructed and from which behaviour dependent upon place information can be generated." (John O'Keefe & Lynn Nadel, "The Hippocampus as a Cognitive Map", 1978)

"A fuzzy cognitive map or FCM draws a causal picture. It ties facts and things and processes to values and policies and objectives. And it lets you predict how complex events interact and play out. [...] Neural nets give a shortcut to tuning an FCM. The trick is to let the fuzzy causal edges change as if they were synapses in a neural net. They cannot change with the same math laws because FCM edges stand for causal effect not signal flow. We bombard the FCM nodes with real data. The data state which nodes are on or off and to which degree at each moment in time. Then the edges grow among the nodes."  (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Under the label 'cognitive maps', mental models have been conceived of as the mental representation of spatial aspects of the environment. A mental model, in this sense, comprises the topology of an area, including relevant districts, landmarks, and paths." (Gert Rickheit & Lorenz Sichelschmidt, "Mental Models: Some Answers, Some Questions, Some Suggestions", 1999)

"Bounded rationality simultaneously constrains the complexity of our cognitive maps and our ability to use them to anticipate the system dynamics. Mental models in which the world is seen as a sequence of events and in which feedback, nonlinearity, time delays, and multiple consequences are lacking lead to poor performance when these elements of dynamic complexity are present. Dysfunction in complex systems can arise from the misperception of the feedback structure of the environment. But rich mental models that capture these sources of complexity cannot be used reliably to understand the dynamics. Dysfunction in complex systems can arise from faulty mental simulation-the misperception of feedback dynamics. These two different bounds on rationality must both be overcome for effective learning to occur. Perfect mental models without a simulation capability yield little insight; a calculus for reliable inferences about dynamics yields systematically erroneous results when applied to simplistic models." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Even if our cognitive maps of causal structure were perfect, learning, especially double-loop learning, would still be difficult. To use a mental model to design a new strategy or organization we must make inferences about the consequences of decision rules that have never been tried and for which we have no data. To do so requires intuitive solution of high-order nonlinear differential equations, a task far exceeding human cognitive capabilities in all but the simplest systems."  (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The robustness of the misperceptions of feedback and the poor performance they cause are due to two basic and related deficiencies in our mental model. First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. Both are direct consequences of bounded rationality, that is, the many limitations of attention, memory, recall, information processing capability, and time that constrain human decision making." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Eliciting and mapping the participant's mental models, while necessary, is far from sufficient [...] the result of the elicitation and mapping process is never more than a set of causal attributions, initial hypotheses about the structure of a system, which must then be tested. Simulation is the only practical way to test these models. The complexity of the cognitive maps produced in an elicitation workshop vastly exceeds our capacity to understand their implications. Qualitative maps are simply too ambiguous and too difficult to simulate mentally to provide much useful information on the adequacy of the model structure or guidance about the future development of the system or the effects of policies." (John D Sterman, "Learning in and about complex systems", Systems Thinking Vol. 3 2003)

"When an individual uses causal mapping to help clarify their own thinking, we call this technique cognitive mapping, because it is related to personal thinking or cognition. When a group maps their own ideas, we call it oval mapping, because we often use oval-shaped cards to record individuals’ ideas so that they can be arranged into a group’s map. Cognitive maps and oval maps can be used to create a strategic plan, because the maps include goals, strategies and actions, just like strategic plans." (John M Bryson et al, "Visible Thinking: Unlocking Causal Mapping For Practical Business Results", 2004)

27 December 2014

🕸Systems Engineering: Limitations (Just the Quotes)

"From a more general philosophical perspective we can say that we wish to model complex systems because we want to understand them better.  The main requirement for our models accordingly shifts from having to be correct to being rich in information.  This does not mean that the relationship between the model and the system itself becomes less important, but the shift from control and prediction to understanding does have an effect on our approach to complexity: the evaluation of our models in terms of performance can be deferred. Once we have a better understanding of the dynamics of complexity, we can start looking for the similarities and differences between different complex systems and thereby develop a clearer understanding of the strengths and limitations of different models." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"The robustness of the misperceptions of feedback and the poor performance they cause are due to two basic and related deficiencies in our mental model. First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. Both are direct consequences of bounded rationality, that is, the many limitations of attention, memory, recall, information processing capability, and time that constrain human decision making." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The very essence of mass communication theory is a simple but all-embracing expression of technological determinism, since the essential features depend on what certain technologies have made possible, certain technologies have made possible, especially the following: communication at a distance, the multiplication and simultaneous distribution of diverse ‘messages’, the enormous capacity and speed of carriers, and the limitations on response. There is no escaping the implication that public communication as practised in modern societies is profoundly shaped by these general features." (Denis McQuail, "McQuail's Reader in Mass Communication Theory", 2002)

"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)

"We are beginning to see the entire universe as a holographically interlinked network of energy and information, organically whole and self-referential at all scales of its existence. We, and all things in the universe, are non-locally connected with each other and with all other things in ways that are unfettered by the hitherto known limitations of space and time." (Ervin László, "Cosmos: A Co-creator's Guide to the Whole-World", 2010)

"Cyberneticists argue that positive feedback may be useful, but it is inherently unstable, capable of causing loss of control and runaway. A higher level of control must therefore be imposed upon any positive feedback mechanism: self-stabilising properties of a negative feedback loop constrain the explosive tendencies of positive feedback. This is the starting point of our journey to explore the role of cybernetics in the control of biological growth. That is the assumption that the evolution of self-limitation has been an absolute necessity for life forms with exponential growth." (Tony Stebbing, "A Cybernetic View of Biological Growth: The Maia Hypothesis", 2011)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

🕸Systems Engineering: Similarity (Just the Quotes)

"Symmetry is evidently a kind of unity in variety, where a whole is determined by the rhythmic repetition of similar." (George Santayana, "The Sense of Beauty", 1896)

"To apply the category of cause and effect means to find out which parts of nature stand in this relation. Similarly, to apply the gestalt category means to find out which parts of nature belong as parts to functional wholes, to discover their position in these wholes, their degree of relative independence, and the articulation of larger wholes into sub-wholes." (Kurt Koffka, 1931)

"By a model we thus mean any physical or chemical system which has a similar relation-structure to that of the process it imitates. By ’relation-structure’ I do not mean some obscure non-physical entity which attends the model, but the fact that it is a physical working model which works in the same way as the process it parallels, in the aspects under consideration at any moment." (Kenneth Craik, "The Nature of Explanation", 1943)

"A material model is the representation of a complex system by a system which is assumed simpler and which is also assumed to have some properties similar to those selected for study in the original complex system. A formal model is a symbolic assertion in logical terms of an idealised relatively simple situation sharing the structural properties of the original factual system." (Arturo Rosenblueth & Norbert Wiener, "The Role of Models in Science", Philosophy of Science Vol. 12" (4), 1945)

"Industrial production, the flow of resources in the economy, the exertion of military effort in a war theater-all are complexes of numerous interrelated activities. Differences may exist in the goals to be achieved, the particular processes involved, and the magnitude of effort. Nevertheless, it is possible to abstract the underlying essential similarities in the management of these seemingly disparate systems." (George Dantzig, "Linear programming and extensions", 1963) 

"System' is the concept that refers both to a complex of interdependencies between parts, components, and processes, that involves discernible regularities of relationships, and to a similar type of interdependency between such a complex and its surrounding environment." (Talcott Parsons, "Systems Analysis: Social Systems", 1968)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic" (that is fixed) rules" (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order" (a pattern) within disorder" (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"The dimensionality and nonlinearity requirements of chaos do not guarantee its appearance. At best, these conditions allow it to occur, and even then under limited conditions relating to particular parameter values. But this does not imply that chaos is rare in the real world. Indeed, discoveries are being made constantly of either the clearly identifiable or arguably persuasive appearance of chaos. Most of these discoveries are being made with regard to physical systems, but the lack of similar discoveries involving human behavior is almost certainly due to the still developing nature of nonlinear analyses in the social sciences rather than the absence of chaos in the human setting. " (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"From a more general philosophical perspective we can say that we wish to model complex systems because we want to understand them better.  The main requirement for our models accordingly shifts from having to be correct to being rich in information.  This does not mean that the relationship between the model and the system itself becomes less important, but the shift from control and prediction to understanding does have an effect on our approach to complexity: the evaluation of our models in terms of performance can be deferred. Once we have a better understanding of the dynamics of complexity, we can start looking for the similarities and differences between different complex systems and thereby develop a clearer understanding of the strengths and limitations of different models." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder. " (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"[…] swarm intelligence is becoming a valuable tool for optimizing the operations of various businesses. Whether similar gains will be made in helping companies better organize themselves and develop more effective strategies remains to be seen. At the very least, though, the field provides a fresh new framework for solving such problems, and it questions the wisdom of certain assumptions regarding the need for employee supervision through command-and-control management. In the future, some companies could build their entire businesses from the ground up using the principles of swarm intelligence, integrating the approach throughout their operations, organization, and strategy. The result: the ultimate self-organizing enterprise that could adapt quickly - and instinctively - to fast-changing markets." (Eric Bonabeau & Christopher Meyer, "Swarm Intelligence: A Whole New Way to Think About Business", Harvard Business Review, 2001)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"Complexity is the characteristic property of complicated systems we don’t understand immediately. It is the amount of difficulties we face while trying to understand it. In this sense, complexity resides largely in the eye of the beholder - someone who is familiar with s.th. often sees less complexity than someone who is less familiar with it. [...] A complex system is created by evolutionary processes. There are multiple pathways by which a system can evolve. Many complex systems are similar, but each instance of a system is unique." (Jochen Fromm, The Emergence of Complexity, 2004)

"Diverse groups of problem solvers outperformed the groups of the best individuals at solving complex problems. The reason: the diverse groups got stuck less often than the smart individuals, who tended to think similarly." (Scott E Page, [interview in The New York Times] 2008)

"A key discovery of network science is that the architecture of networks emerging in various domains of science, nature, and technology are similar to each other, a consequence of being governed by the same organizing principles. Consequently we can use a common set of mathematical tools to explore these systems. " (Albert-László Barabási, "Network Science", 2016)

"The exploding interest in network science during the first decade of the 21st century is rooted in the discovery that despite the obvious diversity of complex systems, the structure and the evolution of the networks behind each system is driven by a common set of fundamental laws and principles. Therefore, notwithstanding the amazing differences in form, size, nature, age, and scope of real networks, most networks are driven by common organizing principles. Once we disregard the nature of the components and the precise nature of the interactions between them, the obtained networks are more similar than different from each other." (Albert-László Barabási, "Network Science", 2016)

See also the quotes on Similarity in Graphical Representation series

26 December 2014

🕸Systems Engineering: Emergence (Just the Quotes)

"[Hierarchy is] the principle according to which entities meaningfully treated as wholes are built up of smaller entities which are themselves wholes […] and so on. In hierarchy, emergent properties denote the levels." (Peter Checkland, "Systems Thinking, Systems Practice", 1981)

"[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations." (Fritjof Capra, "The web of life: a new scientific understanding of living systems", 1996)

"It may not be obvious at first, but the study of emergence and model-building go hand in hand. The essence of model-building is shearing away detail to get at essential elements. A model, by concentrating on selected aspects of the world, makes possible the prediction and planning that reveal new possibilities. That is exactly the problem we face in trying to develop a scientific understanding of emergence." (John H Holland, "Emergence" , Philosophica 59, 1997)

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops." (Fritjof Capra, "The Hidden Connections", 2002)

"This spontaneous emergence of order at critical points of instability is one of the most important concepts of the new understanding of life. It is technically known as self-organization and is often referred to simply as ‘emergence’. It has been recognized as the dynamic origin of development, learning and evolution. In other words, creativity-the generation of new forms-is a key property of all living systems. And since emergence is an integral part of the dynamics of open systems, we reach the important conclusion that open systems develop and evolve. Life constantly reaches out into novelty." (Fritjof  Capra, "The Hidden Connections", 2002)

"Emergence is not really mysterious, although it may be complex. Emergence is brought about by the interactions between the parts of a system. The galloping horse illusion depends upon the persistence of the human retina/brain combination, for instance. Elemental gases bond in combination by sharing outer electrons, thereby altering the appearance and behavior of the combination. In every case of emergence, the source is interaction between the parts - sometimes, as with the brain, very many parts - so that the phenomenon defies simple explanation." (Derek Hitchins, "Advanced Systems Thinking, Engineering and Management", 2003)

"Emergence is the phenomenon of properties, capabilities and behaviours evident in the whole system that are not exclusively ascribable to any of its parts." (Derek Hitchins, "Advanced Systems Thinking, Engineering and Management", 2003)

"Another typical feature of theories of emergence is the layered view of nature. On this view, all things in nature belong to a certain level of existence, each according to its characteristic properties. These levels of existence constitute a hierarchy of increasing complexity that also corresponds to their order of appearance in the course of evolution." (Markus Eronen, "Emergence in the Philosophy of Mind", 2004)

"The basic concept of complexity theory is that systems show patterns of organization without organizer (autonomous or self-organization). Simple local interactions of many mutually interacting parts can lead to emergence of complex global structures. […] Complexity originates from the tendency of large dynamical systems to organize themselves into a critical state, with avalanches or 'punctuations' of all sizes. In the critical state, events which would otherwise be uncoupled became correlated." (Jochen Fromm, "The Emergence of Complexity", 2004)

"Complexity arises when emergent system-level phenomena are characterized by patterns in time or a given state space that have neither too much nor too little form. Neither in stasis nor changing randomly, these emergent phenomena are interesting, due to the coupling of individual and global behaviours as well as the difficulties they pose for prediction. Broad patterns of system behaviour may be predictable, but the system's specific path through a space of possible states is not." (Steve Maguire et al, "Complexity Science and Organization Studies", 2006)

"The beauty of nature insists on taking its time. Everything is prepared. Nothing is rushed. The rhythm of emergence is a gradual, slow beat; always inching its way forward, change remains faithful to itself until the new unfolds in the full confidence of true arrival. Because nothing is abrupt, the beginning of spring nearly always catches us unawares. It is there before we see it; and then we can look nowhere without seeing it. (John O'Donohue, "To Bless the Space Between Us: A Book of Blessings", 2008)

"Although the potential for chaos resides in every system, chaos, when it emerges, frequently stays within the bounds of its attractor(s): No point or pattern of points is ever repeated, but some form of patterning emerges, rather than randomness. Life scientists in different areas have noticed that life seems able to balance order and chaos at a place of balance known as the edge of chaos. Observations from both nature and artificial life suggest that the edge of chaos favors evolutionary adaptation." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"If universality is one of the observed characteristics of complex dynamical systems in many fields of study, a second characteristic that flows from the study of these systems is that of emergence. As self-organizing systems go about their daily business, they are constantly exchanging matter and energy with their environment, and this allows them to remain in a state that is far from equilibrium. That allows spontaneous behavior to give rise to new patterns." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"The notion of emergence is used in a variety of disciplines such as evolutionary biology, the philosophy of mind and sociology, as well as in computational and complexity theory. It is associated with non-reductive naturalism, which claims that a hierarchy of levels of reality exist. While the emergent level is constituted by the underlying level, it is nevertheless autonomous from the constituting level. As a naturalistic theory, it excludes non-natural explanations such as vitalistic forces or entelechy. As non-reductive naturalism, emergence theory claims that higher-level entities cannot be explained by lower-level entities." (Martin Neumann, "An Epistemological Gap in Simulation Technologies and the Science of Society", 2011)

"System theorists know that it's easy to couple simple-to-understand systems into a ‘super system’ that's capable of displaying behavioral modes that cannot be seen in any of its constituent parts. This is the process called ‘emergence’." (John L Casti, [interview with Austin Allen], 2012)

"Every system that has existed emerged somehow, from somewhere, at some point. Complexity science emphasizes the study of how systems evolve through their disorganized parts into an organized whole." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"Things evolve to evolve. Evolutionary processes are the linchpin of change. These processes of discovery represent a complexity of simple systems that flux in perpetual tension as they teeter at the edge of chaos. This whirlwind of emergence is responsible for the spontaneous order and higher, organized complexity so noticeable in biological evolution - one–celled critters beefing up to become multicellular organisms." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"This spontaneous emergence of order at critical points of instability, which is often referred to simply as 'emergence', is one of the hallmarks of life. It has been recognized as the dynamic origin of development, learning, and evolution. In other words, creativity-the generation of new forms-is a key property of all living systems." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

More quotes on "Emergence" at the-web-of-knowledge.blogspot.com.

25 December 2014

🕸Systems Engineering: Connectedness (Just the Quotes)

"The first attempts to consider the behavior of so-called 'random neural nets' in a systematic way have led to a series of problems concerned with relations between the 'structure' and the 'function' of such nets. The 'structure' of a random net is not a clearly defined topological manifold such as could be used to describe a circuit with explicitly given connections. In a random neural net, one does not speak of 'this' neuron synapsing on 'that' one, but rather in terms of tendencies and probabilities associated with points or regions in the net." (Anatol Rapoport, "Cycle distributions in random nets", The Bulletin of Mathematical Biophysics 10(3), 1948)

"A NETWORK is a collection of connected lines, each of which indicates the movement of some quantity between two locations. Generally, entrance to a network is via a source (the starting point) and exit from a network is via a sink (the finishing point); the lines which form the network are called links (or arcs), and the points at which two or more links meet are called nodes." (Cecil W Lowe, "Critical Path Analysis by Bar Chart", 1966)

"The essential vision of reality presents us not with fugitive appearances but with felt patterns of order which have coherence and meaning for the eye and for the mind. Symmetry, balance and rhythmic sequences express characteristics of natural phenomena: the connectedness of nature - the order, the logic, the living process. Here art and science meet on common ground." (Gyorgy Kepes, "The New Landscape: In Art and Science", 1956)

"In fact, it is empirically ascertainable that every event is actually produced by a number of factors, or is at least accompanied by numerous other events that are somehow connected with it, so that the singling out involved in the picture of the causal chain is an extreme abstraction. Just as ideal objects cannot be isolated from their proper context, material existents exhibit multiple interconnections; therefore the universe is not a heap of things but a system of interacting systems." (Mario Bunge, "Causality: The place of the casual principles in modern science", 1959)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […]  'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 1979)

"All certainty in our relationships with the world rests on acknowledgement of causality. Causality is a genetic connection of phenomena through which one thing (the cause) under certain conditions gives rise to, causes something else (the effect). The essence of causality is the generation and determination of one phenomenon by another." (Alexander Spirkin, "Dialectical Materialism", 1983)

"When loops are present, the network is no longer singly connected and local propagation schemes will invariably run into trouble. [...] If we ignore the existence of loops and permit the nodes to continue communicating with each other as if the network were singly connected, messages may circulate indefinitely around the loops and process may not converges to a stable equilibrium. […] Such oscillations do not normally occur in probabilistic networks […] which tend to bring all messages to some stable equilibrium as time goes on. However, this asymptotic equilibrium is not coherent, in the sense that it does not represent the posterior probabilities of all nodes of the network." (Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988)

"A self-organizing system not only regulates or adapts its behavior, it creates its own organization. In that respect it differs fundamentally from our present systems, which are created by their designer. We define organization as structure with function. Structure means that the components of a system are arranged in a particular order. It requires both connections, that integrate the parts into a whole, and separations that differentiate subsystems, so as to avoid interference. Function means that this structure fulfils a purpose." (Francis Heylighen & Carlos Gershenson, "The Meaning of Self-organization in Computing", IEEE Intelligent Systems, 2003)

"Nodes and connectors comprise the structure of a network. In contrast, an ecology is a living organism. It influences the formation of the network itself." (George Siemens, "Knowing Knowledge", 2006)

"If a network is solely composed of neighborhood connections, information must traverse a large number of connections to get from place to place. In a small-world network, however, information can be transmitted between any two nodes using, typically, only a small number of connections. In fact, just a small percentage of random, long-distance connections is required to induce such connectivity. This type of network behavior allows the generation of 'six degrees of separation' type results, whereby any agent can connect to any other agent in the system via a path consisting of only a few intermediate nodes." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Networks may also be important in terms of view. Many models assume that agents are bunched together on the head of a pin, whereas the reality is that most agents exist within a topology of connections to other agents, and such connections may have an important influence on behavior. […] Models that ignore networks, that is, that assume all activity takes place on the head of a pin, can easily suppress some of the most interesting aspects of the world around us. In a pinhead world, there is no segregation, and majority rule leads to complete conformity - outcomes that, while easy to derive, are of little use." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Complexity theory embraces things that are complicated, involve many elements and many interactions, are not deterministic, and are given to unexpected outcomes. […] A fundamental aspect of complexity theory is the overall or aggregate behavior of a large number of items, parts, or units that are entangled, connected, or networked together. […] In contrast to classical scientific methods that directly link theory and outcome, complexity theory does not typically provide simple cause-and-effect explanations." (Robert E Gunther et al, "The Network Challenge: Strategy, Profit, and Risk in an Interlinked World", 2009)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"System dynamics is an approach to understanding the behaviour of over time. It deals with internal feedback loops and time delays that affect the behaviour of the entire system. It also helps the decision maker untangle the complexity of the connections between various policy variables by providing a new language and set of tools to describe. Then it does this by modeling the cause and effect relationships among these variables." (Raed M Al-Qirem & Saad G Yaseen, "Modelling a Small Firm in Jordan Using System Dynamics", 2010)

"We are beginning to see the entire universe as a holographically interlinked network of energy and information, organically whole and self-referential at all scales of its existence. We, and all things in the universe, are non-locally connected with each other and with all other things in ways that are unfettered by the hitherto known limitations of space and time." (Ervin László, "Cosmos: A Co-creator's Guide to the Whole-World", 2010)

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 2013) 
