Showing posts with label entropy. Show all posts

15 December 2018

🔭Data Science: Probability (Just the Quotes)

"Probability is a degree of possibility." (Gottfried W Leibniz, "On estimating the uncertain", 1676)

"Probability, however, is not something absolute, [it is] drawn from certain information which, although it does not suffice to resolve the problem, nevertheless ensures that we judge correctly which of the two opposites is the easiest given the conditions known to us." (Gottfried W Leibniz, "Forethoughts for an encyclopaedia or universal science", cca. 1679)

"[…] the highest probability amounts not to certainty, without which there can be no true knowledge." (John Locke, "An Essay Concerning Human Understanding", 1689)

"As mathematical and absolute certainty is seldom to be attained in human affairs, reason and public utility require that judges and all mankind in forming their opinions of the truth of facts should be regulated by the superior number of the probabilities on the one side or the other whether the amount of these probabilities be expressed in words and arguments or by figures and numbers." (William Murray, 1773)

"All certainty which does not consist in mathematical demonstration is nothing more than the highest probability; there is no other historical certainty." (Voltaire, "A Philosophical Dictionary", 1881)

"Nature prefers the more probable states to the less probable because in nature processes take place in the direction of greater probability. Heat goes from a body at higher temperature to a body at lower temperature because the state of equal temperature distribution is more probable than a state of unequal temperature distribution." (Max Planck, "The Atomic Theory of Matter", 1909)

"Sometimes the probability in favor of a generalization is enormous, but the infinite probability of certainty is never reached." (William Dampier-Whetham, "Science and the Human Mind", 1912)

"There can be no unique probability attached to any event or behaviour: we can only speak of ‘probability in the light of certain given information’, and the probability alters according to the extent of the information." (Sir Arthur S Eddington, "The Nature of the Physical World", 1928)

"[…] the statistical prediction of the future from the past cannot be generally valid, because whatever is future to any given past, is in turn past for some future. That is, whoever continually revises his judgment of the probability of a statistical generalization by its successively observed verifications and failures, cannot fail to make more successful predictions than if he should disregard the past in his anticipation of the future. This might be called the ‘Principle of statistical accumulation’." (Clarence I Lewis, "Mind and the World-Order: Outline of a Theory of Knowledge", 1929)

"Science does not aim, primarily, at high probabilities. It aims at a high informative content, well backed by experience. But a hypothesis may be very probable simply because it tells us nothing, or very little." (Karl Popper, "The Logic of Scientific Discovery", 1934)

"The most important application of the theory of probability is to what we may call 'chance-like' or 'random' events, or occurrences. These seem to be characterized by a peculiar kind of incalculability which makes one disposed to believe - after many unsuccessful attempts - that all known rational methods of prediction must fail in their case. We have, as it were, the feeling that not a scientist but only a prophet could predict them. And yet, it is just this incalculability that makes us conclude that the calculus of probability can be applied to these events." (Karl R Popper, "The Logic of Scientific Discovery", 1934)

"Equiprobability in the physical world is purely a hypothesis. We may exercise the greatest care and the most accurate of scientific instruments to determine whether or not a penny is symmetrical. Even if we are satisfied that it is, and that our evidence on that point is conclusive, our knowledge, or rather our ignorance, about the vast number of other causes which affect the fall of the penny is so abysmal that the fact of the penny’s symmetry is a mere detail. Thus, the statement 'head and tail are equiprobable' is at best an assumption." (Edward Kasner & James R Newman, "Mathematics and the Imagination", 1940)

"Probabilities must be regarded as analogous to the measurement of physical magnitudes; that is to say, they can never be known exactly, but only within certain approximation." (Emile Borel, "Probabilities and Life", 1943)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)
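Wiener's identification of information with the negative logarithm of probability is easy to check numerically. A minimal Python sketch (the example probabilities are invented for illustration):

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of a message with probability p."""
    return -math.log2(p)

# The more probable the message, the less information it gives:
print(self_information(0.9))    # a cliché: about 0.15 bits
print(self_information(0.001))  # a rare message: nearly 10 bits
```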

"To say that observations of the past are certain, whereas predictions are merely probable, is not the ultimate answer to the question of induction; it is only a sort of intermediate answer, which is incomplete unless a theory of probability is developed that explains what we should mean by ‘probable’ and on what ground we can assert probabilities." (Hans Reichenbach, "The Rise of Scientific Philosophy", 1951)

"Uncertainty is introduced, however, by the impossibility of making generalizations, most of the time, which happens to all members of a class. Even scientific truth is a matter of probability and the degree of probability stops somewhere short of certainty." (Wayne C Minnick, "The Art of Persuasion", 1957)

"Everybody has some idea of the meaning of the term 'probability' but there is no agreement among scientists on a precise definition of the term for the purpose of scientific methodology. It is sufficient for our purpose, however, if the concept is interpreted in terms of relative frequency, or more simply, how many times a particular event is likely to occur in a large population." (Alfred R Ilersic, "Statistics", 1959)

"Incomplete knowledge must be considered as perfectly normal in probability theory; we might even say that, if we knew all the circumstances of a phenomenon, there would be no place for probability, and we would know the outcome with certainty." (Félix E Borel, "Probability and Certainty", 1963)

"Probability is the mathematics of uncertainty. Not only do we constantly face situations in which there is neither adequate data nor an adequate theory, but many modern theories have uncertainty built into their foundations. Thus learning to think in terms of probability is essential. Statistics is the reverse of probability (glibly speaking). In probability you go from the model of the situation to what you expect to see; in statistics you have the observations and you wish to estimate features of the underlying model." (Richard W Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'noise' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard W Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"Probabilities are summaries of knowledge that is left behind when information is transferred to a higher level of abstraction." (Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Network of Plausible, Inference", 1988)

"[In statistics] you have the fact that the concepts are not very clean. The idea of probability, of randomness, is not a clean mathematical idea. You cannot produce random numbers mathematically. They can only be produced by things like tossing dice or spinning a roulette wheel. With a formula, any formula, the number you get would be predictable and therefore not random. So as a statistician you have to rely on some conception of a world where things happen in some way at random, a conception which mathematicians don’t have." (Lucien LeCam, [interview] 1988)

"So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician's trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. [...] It is in those outliers and imperfections that the wildness lurks." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996) 

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"In the laws of probability theory, likelihood distributions are fixed properties of a hypothesis. In the art of rationality, to explain is to anticipate. To anticipate is to explain." (Eliezer S. Yudkowsky, "A Technical Explanation of Technical Explanation", 2005)

"For some scientific data the true value cannot be given by a constant or some straightforward mathematical function but by a probability distribution or an expectation value. Such data are called probabilistic. Even so, their true value does not change with time or place, making them distinctly different from most statistical data of everyday life." (Manfred Drosg, "Dealing with Uncertainties: A Guide to Error Analysis", 2007)

"In fact, H [entropy] measures the amount of uncertainty that exists in the phenomenon. If there were only one event, its probability would be equal to 1, and H would be equal to 0 - that is, there is no uncertainty about what will happen in a phenomenon with a single event because we always know what is going to occur. The more events that a phenomenon possesses, the more uncertainty there is about the state of the phenomenon. In other words, the more entropy, the more information." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)
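The H described above is Shannon entropy; a small Python sketch of the single-event case versus phenomena with more possible events:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1.0]))        # one certain event: H = 0, no uncertainty
print(entropy([0.5, 0.5]))   # two equally likely events: H = 1 bit
print(entropy([0.25] * 4))   # four equally likely events: H = 2 bits
```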

"The four questions of data analysis are the questions of description, probability, inference, and homogeneity. [...] Descriptive statistics are built on the assumption that we can use a single value to characterize a single property for a single universe. […] Probability theory is focused on what happens to samples drawn from a known universe. If the data happen to come from different sources, then there are multiple universes with different probability models. [...] Statistical inference assumes that you have a sample that is known to have come from one universe." (Donald J Wheeler, "Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

24 November 2018

🔭Data Science: Noise (Just the Quotes)

"Information that is only partially structured (and therefore contains some 'noise') is fuzzy, inconsistent, and indistinct. Such imperfect information may be regarded as having merit only if it represents an intermediate step in structuring the information into a final meaningful form. If the partially structured information remains in fuzzy form, it will create a state of dissatisfaction in the mind of the originator and certainly in the mind of the recipient. The natural desire is to continue structuring until clarity, simplicity, precision, and definitiveness are obtained." (Cecil H Meyers, "Handbook of Basic Graphs: A modern approach", 1970)

"To understand the need for structuring information, we should examine its opposite - nonstructured information. Nonstructured information may be thought of as noise, which exists and can be heard (or sensed with audio devices), but the mind attaches no rational meaning to the sound. In another sense, noise can be equated to writing a group of letters, numbers, and other symbols on a page without any design or key to their meaning. In such a situation, there is nothing the mind can grasp. Nonstructured information can be classified as useless, unless meaning exists somewhere in the jumble and a key can be found to unlock its hidden significance." (Cecil H Meyers, "Handbook of Basic Graphs: A modern approach", 1970)

"Neither noise nor information is predictable." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"Data are collected as a basis for action. Yet before anyone can use data as a basis for action the data have to be interpreted. The proper interpretation of data will require that the data be presented in context, and that the analysis technique used will filter out the noise."  (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"Data are generally collected as a basis for action. However, unless potential signals are separated from probable noise, the actions taken may be totally inconsistent with the data. Thus, the proper use of data requires that you have simple and effective methods of analysis which will properly separate potential signals from probable noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"No matter what the data, and no matter how the values are arranged and presented, you must always use some method of analysis to come up with an interpretation of the data. While every data set contains noise, some data sets may contain signals. Therefore, before you can detect a signal within any given data set, you must first filter out the noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"We analyze numbers in order to know when a change has occurred in our processes or systems. We want to know about such changes in a timely manner so that we can respond appropriately. While this sounds rather straightforward, there is a complication - the numbers can change even when our process does not. So, in our analysis of numbers, we need to have a way to distinguish those changes in the numbers that represent changes in our process from those that are essentially noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"While all data contain noise, some data contain signals. Before you can detect a signal, you must filter out the noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re-coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"This phenomenon, common to chaos theory, is also known as sensitive dependence on initial conditions. Just a small change in the initial conditions can drastically change the long-term behavior of a system. Such a small amount of difference in a measurement might be considered experimental noise, background noise, or an inaccuracy of the equipment." (Greg Rae, "Chaos Theory: A Brief Introduction", 2006)

"Data analysis is not generally thought of as being simple or easy, but it can be. The first step is to understand that the purpose of data analysis is to separate any signals that may be contained within the data from the noise in the data. Once you have filtered out the noise, anything left over will be your potential signals. The rest is just details." (Donald J Wheeler, "Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Economists should study financial markets as they actually operate, not as they assume them to operate - observing the way in which information is actually processed, observing the serial correlations, bonanzas, and sudden stops, not assuming these away as noise around the edges of efficient and rational markets." (Adair Turner, "Economics after the Crisis: Objectives and means", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Typically, most outlier detection algorithms use some quantified measure of the outlierness of a data point, such as the sparsity of the underlying region, nearest neighbor based distance, or the fit to the underlying data distribution. Every data point lies on a continuous spectrum from normal data to noise, and finally to anomalies [...] The separation of the different regions of this spectrum is often not precisely defined, and is chosen on an ad-hoc basis according to application-specific criteria. Furthermore, the separation between noise and anomalies is not pure, and many data points created by a noisy generative process may be deviant enough to be interpreted as anomalies on the basis of the outlier score. Thus, anomalies will typically have a much higher outlier score than noise, but this is not a distinguishing factor between the two as a matter of definition. Rather, it is the interest of the analyst, which regulates the distinction between noise and an anomaly." (Charu C Aggarwal, "Outlier Analysis", 2013)

"A complete data analysis will involve the following steps: (i) Finding a good model to fit the signal based on the data. (ii) Finding a good model to fit the noise, based on the residuals from the model. (iii) Adjusting variances, test statistics, confidence intervals, and predictions, based on the model for the noise." (DeWayne R Derryberry, "Basic data analysis for time series with R", 2014)

"The random element in most data analysis is assumed to be white noise - normal errors independent of each other. In a time series, the errors are often linked so that independence cannot be assumed (the last examples). Modeling the nature of this dependence is the key to time series." (DeWayne R Derryberry, "Basic data analysis for time series with R", 2014)
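Derryberry's point that time-series errors are often linked, while white noise is independent, can be checked with a lag-1 autocorrelation of the residuals. A self-contained sketch (the simulated series are invented for illustration):

```python
import random

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation; values near 0 are consistent with white noise."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

random.seed(0)
white = [random.gauss(0, 1) for _ in range(1000)]  # independent normal errors
walk = [sum(white[:i + 1]) for i in range(1000)]   # cumulative sum: strongly dependent

print(round(lag1_autocorr(white), 3))  # close to 0
print(round(lag1_autocorr(walk), 3))   # close to 1
```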

"A signal is a useful message that resides in data. Data that isn’t useful is noise. […] When data is expressed visually, noise can exist not only as data that doesn’t inform but also as meaningless non-data elements of the display (e.g. irrelevant attributes, such as a third dimension of depth in bars, color variation that has no significance, and artificial light and shadow effects)." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"Data contain descriptions. Some are true, some are not. Some are useful, most are not. Skillful use of data requires that we learn to pick out the pieces that are true and useful. [...] To find signals in data, we must learn to reduce the noise - not just the noise that resides in the data, but also the noise that resides in us. It is nearly impossible for noisy minds to perceive anything but noise in data." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"When we find data quality issues due to valid data during data exploration, we should note these issues in a data quality plan for potential handling later in the project. The most common issues in this regard are missing values and outliers, which are both examples of noise in the data." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, worked examples, and case studies", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and helps determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"Repeated observations of the same phenomenon do not always produce the same results, due to random noise or error. Sampling errors result when our observations capture unrepresentative circumstances, like measuring rush hour traffic on weekends as well as during the work week. Measurement errors reflect the limits of precision inherent in any sensing device. The notion of signal to noise ratio captures the degree to which a series of observations reflects a quantity of interest as opposed to data variance. As data scientists, we care about changes in the signal instead of the noise, and such variance often makes this problem surprisingly difficult." (Steven S Skiena, "The Data Science Design Manual", 2017)
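The signal-to-noise ratio Skiena describes can be illustrated as a ratio of variances. A sketch with an invented sinusoidal signal and Gaussian measurement noise:

```python
import math
import random

def variance(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

random.seed(1)
signal = [math.sin(2 * math.pi * i / 50) for i in range(1000)]  # quantity of interest
noise = [random.gauss(0, 0.1) for _ in range(1000)]             # measurement error
observed = [s + n for s, n in zip(signal, noise)]

# Signal power is ~0.5 and noise power ~0.01, so the ratio is large here;
# shrinking it makes the quantity of interest progressively harder to recover.
snr = variance(signal) / variance(noise)
print(round(snr, 1))
```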

"Using noise (the uncorrelated variables) to fit noise (the residual left from a simple model on the genuinely correlated variables) is asking for trouble." (Steven S Skiena, "The Data Science Design Manual", 2017)

"The high generalization error in a neural network may be caused by several reasons. First, the data itself might have a lot of noise, in which case there is little one can do in order to improve accuracy. Second, neural networks are hard to train, and the large error might be caused by the poor convergence behavior of the algorithm. The error might also be caused by high bias, which is referred to as underfitting. Finally, overfitting (i.e., high variance) may cause a large part of the generalization error. In most cases, the error is a combination of more than one of these different factors." (Charu C Aggarwal, "Neural Networks and Deep Learning: A Textbook", 2018)

"[...] in the statistical world, what we see and measure around us can be considered as the sum of a systematic mathematical idealized form plus some random contribution that cannot yet be explained. This is the classic idea of the signal and the noise." (David Spiegelhalter, "The Art of Statistics: Learning from Data", 2019)

"Visualizations can remove the background noise from enormous sets of data so that only the most important points stand out to the intended audience. This is particularly important in the era of big data. The more data there is, the more chance for noise and outliers to interfere with the core concepts of the data set." (Kate Strachnyi, "ColorWise: A Data Storyteller’s Guide to the Intentional Use of Color", 2023)

23 December 2014

🕸Systems Engineering: Entropy (Just the Quotes)

"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state. Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig von Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.)" (Erwin Schrödinger, "What is Life?", 1944)

"Every process, event, happening - call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)

"Time itself will come to an end. For entropy points the direction of time. Entropy is the measure of randomness. When all system and order in the universe have vanished, when randomness is at its maximum, and entropy cannot be increased, when there is no longer any sequence of cause and effect, in short when the universe has run down, there will be no direction to time - there will be no time." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"The powerful notion of entropy, which comes from a very special branch of physics […] is certainly useful in the study of communication and quite helpful when applied in the theory of language." (J Robert Oppenheimer, "The Growth of Science and the Structure of Culture", Daedalus 87 (1), 1958) 

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure 'disorder' by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the 'disorder' is less." (Richard P Feynman, "Order And Entropy" ["The Feynman Lectures on Physics"], 1964)
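Feynman's counting argument can be reproduced with a toy example: the number of ways to place the white molecules among the available sites is a binomial coefficient, and the entropy (up to a constant) is its logarithm. The site and molecule counts below are invented for illustration.

```python
import math

sites, white = 20, 10  # toy numbers: 20 volume elements, 10 white molecules

separated = math.comb(10, 10)           # white confined to one half: only 1 arrangement
unrestricted = math.comb(sites, white)  # no restriction on which goes where: 184,756 ways

print(math.log(separated))               # entropy 0: perfectly ordered
print(round(math.log(unrestricted), 2))  # larger entropy: the 'disorder' is greater
```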

"The homeostatic principle does not apply literally to the functioning of all complex living systems, in that in counteracting entropy they move toward growth and expansion." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Higher, directed forms of energy (e.g., mechanical, electric, chemical) are dissipated, that is, progressively converted into the lowest form of energy, i.e., undirected heat movement of molecules; chemical systems tend toward equilibria with maximum entropy; machines wear out owing to friction; in communication channels, information can only be lost by conversion of messages into noise but not vice versa, and so forth." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)

"Entropy theory, on the other hand, is not concerned with the probability of succession in a series of items but with the overall distribution of kinds of items in a given arrangement." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979) 

"Thus, an increase in entropy means a decrease in our ability to change thermal energy, the energy of heat, into mechanical energy. An increase of entropy means a decrease of available energy." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)

"The third model regards mind as an information processing system. This is the model of mind subscribed to by cognitive psychologists and also to some extent by the ego psychologists. Since an acquisition of information entails maximization of negative entropy and complexity, this model of mind assumes mind to be an open system." (Thaddus E Weckowicz, "Models of Mental Illness", 1984) 

"Disorder increases with time because we measure time in the direction in which disorder increases." (Stephen W Hawking, "The Direction of Time", New Scientist 115 (1568), 1987)

"Somehow, after all, as the universe ebbs toward its final equilibrium in the featureless heat bath of maximum entropy, it manages to create interesting structures." (James Gleick, "Chaos: Making a New Science", 1987)

"Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases."  (Stephen Hawking, "A Brief History of Time", 1988)

"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures." (Ervin László, "Information Technology and Social Change: An Evolutionary Systems Analysis", Behavioral Science 37, 1992) 

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"Contrary to what happens at equilibrium, or near equilibrium, systems far from equilibrium do not conform to any minimum principle that is valid for functions of free energy or entropy production." (Ilya Prigogine, "The End of Certainty: Time, Chaos, and the New Laws of Nature", 1996) 

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant recoding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Nature normally hates power laws. In ordinary systems all quantities follow bell curves, and correlations decay rapidly, obeying exponential laws. But all that changes if the system is forced to undergo a phase transition. Then power laws emerge-nature's unmistakable sign that chaos is departing in favor of order. The theory of phase transitions told us loud and clear that the road from disorder to order is maintained by the powerful forces of self-organization and is paved by power laws. It told us that power laws are not just another way of characterizing a system's behavior. They are the patent signatures of self-organization in complex systems." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman Dyson, [Page-Barbour lecture], 2004)

"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non- quilibrium Thermodynamics and the Production of Entropy"], 2005)

"However, the law of accelerating returns pertains to evolution, which is not a closed system. It takes place amid great chaos and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order."  (Ray Kurzweil, "The Singularity is Near", 2005)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010) 

"The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced." (Jeremy Rifkin, "The Third Industrial Revolution", 2011)

"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)

"The passage of time and the action of entropy bring about ever-greater complexity - a branching, blossoming tree of possibilities. Blossoming disorder (things getting worse), now unfolding within the constraints of the physics of our universe, creates novel opportunities for spontaneous ordered complexity to arise." (D J MacLennan, "Frozen to Life", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017)

"In information theory this notion, introduced by Claude Shannon, is used to express unpredictability of information content. For instance, if a data set containing n items was divided into k groups each comprising n i items, then the entropy of such a partition is H = p 1 log( p 1 ) + … + p k log( p k ), where p i = n i / n . In case of two alternative partitions, the mutual information is a measure of the mutual dependence between these partitions." (Slawomir T Wierzchon, "Ensemble Clustering Data Mining and Databases", 2018) [where i is used as index]

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." ("G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

More quotes on "Entropy" at the-web-of-knowledge.blogspot.com.

15 December 2014

🕸Systems Engineering: Symmetry (Just the Quotes)

"In every symmetrical system every deformation that tends to destroy the symmetry is complemented by an equal and opposite deformation that tends to restore it. […] One condition, therefore, though not an absolutely sufficient one, that a maximum or minimum of work corresponds to the form of equilibrium, is thus applied by symmetry." (Ernst Mach, "The Science of Mechanics: A Critical and Historical Account of Its Development", 1893)

"Symmetry is evidently a kind of unity in variety, where a whole is determined by the rhythmic repetition of similar." (George Santayana, "The Sense of Beauty", 1896)

"It is the harmony of the diverse parts, their symmetry, their happy balance; in a word it is all that introduces order, all that gives unity, that permits us to see clearly and to comprehend at once both the ensemble and the details." (Henri Poincaré, "The Future of Mathematics", Monist, Vol. 20 (1), 1910)

"But seldom is asymmetry merely the absence of symmetry. Even in asymmetric designs one feels symmetry as the norm from which one deviates under the influence of forces of non-formal character." (Hermann Weyl, "Symmetry", 1952)

"[…] nature, at the fundamental level, does not just prefer symmetry in a physical theory; nature demands it." (Jennifer T Thompson, "Beyond Einstein: The Cosmic Quest for the Theory of the Universe", 1987)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"In everyday language, the words 'pattern' and 'symmetry' are used almost interchangeably, to indicate a property possessed by a regular arrangement of more-or-less identical units […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"Nature behaves in ways that look mathematical, but nature is not the same as mathematics. Every mathematical model makes simplifying assumptions; its conclusions are only as valid as those assumptions. The assumption of perfect symmetry is excellent as a technique for deducing the conditions under which symmetry-breaking is going to occur, the general form of the result, and the range of possible behaviour. To deduce exactly which effect is selected from this range in a practical situation, we have to know which imperfections are present" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"Symmetry breaking in psychology is governed by the nonlinear causality of complex systems (the 'butterfly effect'), which roughly means that a small cause can have a big effect. Tiny details of initial individual perspectives, but also cognitive prejudices, may 'enslave' the other modes and lead to one dominant view." (Klaus Mainzer, "Thinking in Complexity", 1994)

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"A sudden change in the evolutive dynamics of a system (a ‘surprise’) can emerge, apparently violating a symmetrical law that was formulated by making a reduction on some (or many) finite sequences of numerical data. This is the crucial point. As we have said on a number of occasions, complexity emerges as a breakdown of symmetry (a system that, by evolving with continuity, suddenly passes from one attractor to another) in laws which, expressed in mathematical form, are symmetrical. Nonetheless, this breakdown happens. It is the surprise, the paradox, a sort of butterfly effect that can highlight small differences between numbers that are very close to one another in the continuum of real numbers; differences that may evade the experimental interpretation of data, but that may increasingly amplify in the system’s dynamics." (Cristoforo S Bertuglia & Franco Vaio, "Nonlinearity, Chaos, and Complexity: The Dynamics of Natural and Social Systems", 2003)

"A symmetry is a set of transformations applied to a structure, such that the transformations preserve the properties of the structure." (Philip Dorrell, "What is Music?: Solving a Scientific Mystery", 2004)

"A graph enables us to visualize a relation over a set, which makes the characteristics of relations such as transitivity and symmetry easier to understand. […] Notions such as paths and cycles are key to understanding the more complex and powerful concepts of graph theory. There are many degrees of connectedness that apply to a graph; understanding these types of connectedness enables the engineer to understand the basic properties that can be defined for the graph representing some aspect of his or her system. The concepts of adjacency and reachability are the first steps to understanding the ability of an allocated architecture of a system to execute properly." (Dennis M Buede, "The Engineering Design of Systems: Models and methods", 2009)

More quotes on "Symmetry" at the-web-of-knowledge.blogspot.com.

05 December 2014

🕸Systems Engineering: Environment (Just the Quotes)

"The change from one stable equilibrium to the other may take place as the result of the isolation of a small unrepresentative group of the population, a temporary change in the environment which alters the relative viability of different types, or in several other ways." (John B S Haldane, "The Causes of Evolution", 1932)

"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.) (Erwin Schrödinger, "What is Life?", 1944)

"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)

"Every isolated determinate dynamic system, obeying unchanging laws, will ultimately develop some sort of organisms that are adapted to their environments." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"[...] in a state of dynamic equilibrium with their environments. If they do not maintain this equilibrium they die; if they do maintain it they show a degree of spontaneity, variability, and purposiveness of response unknown in the non-living world. This is what is meant by ‘adaptation to environment’ […] [Its] essential feature […] is stability - that is, the ability to withstand disturbances." (Kenneth Craik, 'Living organisms', "The Nature of Psychology", 1966)

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"System' is the concept that refers both to a complex of interdependencies between parts, components, and processes, that involves discernible regularities of relationships, and to a similar type of interdependency between such a complex and its surrounding environment." (Talcott Parsons, "Systems Analysis: Social Systems", 1968)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"The main object of cybernetics is to supply adaptive, hierarchical models, involving feedback and the like, to all aspects of our environment. Often such modelling implies simulation of a system where the simulation should achieve the object of copying both the method of achievement and the end result. Synthesis, as opposed to simulation, is concerned with achieving only the end result and is less concerned (or completely unconcerned) with the method by which the end result is achieved. In the case of behaviour, psychology is concerned with simulation, while cybernetics, although also interested in simulation, is primarily concerned with synthesis." (Frank H George, "Soviet Cybernetics, the militairy and Professor Lerner", New Scientist, 1973)

"For any system the environment is always more complex than the system itself. No system can maintain itself by means of a point-for-point correlation with its environment, i.e., can summon enough 'requisite variety' to match its environment. So each one has to reduce environmental complexity - primarily by restricting the environment itself and perceiving it in a categorically preformed way. On the other hand, the difference of system and environment is a prerequisite for the reduction of complexity because reduction can be performed only within the system, both for the system itself and its environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"General systems theory and cybernetics supplanted the classical conceptual model of a whole made out of parts and relations between parts with a model emphasizing the difference between systems and environments. This new paradigm made it possible to relate both the structures (including forms of differentiation) and processes of systems to the environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"There is a strong current in contemporary culture advocating ‘holistic’ views as some sort of cure-all […] Reductionism implies attention to a lower level while holistic implies attention to higher level. These are intertwined in any satisfactory description: and each entails some loss relative to our cognitive preferences, as well as some gain [...] there is no whole system without an interconnection of its parts and there is no whole system without an environment." (Francisco Varela, "On being autonomous: The lessons of natural history for systems theory", 1977)

"Every system of whatever size must maintain its own structure and must deal with a dynamic environment, i.e., the system must strike a proper balance between stability and change. The cybernetic mechanisms for stability (i.e., homeostasis, negative feedback, autopoiesis, equifinality) and change (i.e., positive feedback, algedonodes, self-organization) are found in all viable systems." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)

"Any system that insulates itself from diversity in the environment tends to atrophy and lose its complexity and distinctive nature." (Gareth Morgan, "Images of Organization", 1986)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"Neural networks conserve the complexity of the systems they model because they have complex structures themselves. Neural networks encode information about their environment in a distributed form. […] Neural networks have the capacity to self-organise their internal structure." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Systems thinking practices the exact opposite of this analytic approach. Systems thinking studies the organization as a whole in its interaction with its environment. Then, it works backwards to understand how each part of that whole works in relation to, and support of, the entire system’s objectives. Only then can the core strategies be formulated." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Systems, and organizations as systems, can only be understood holistically. Try to understand the system and its environment first. Organizations are open systems and, as such, are viable only in interaction with and adaptation to the changing environment." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"Feedback and its big brother, control theory, are such important concepts that it is odd that they usually find no formal place in the education of physicists. On the practical side, experimentalists often need to use feedback. Almost any experiment is subject to the vagaries of environmental perturbations. Usually, one wants to vary a parameter of interest while holding all others constant. How to do this properly is the subject of control theory. More fundamentally, feedback is one of the great ideas developed (mostly) in the last century, with particularly deep consequences for biological systems, and all physicists should have some understanding of such a basic concept." (John Bechhoefer, "Feedback for physicists: A tutorial essay on control", Reviews of Modern Physics Vol. 77, 2005)

"The single most important property of a cybernetic system is that it is controlled by the relationship between endogenous goals and the external environment. [...] In a complex system, overarching goals may be maintained (or attained) by means of an array of hierarchically organized subgoals that may be pursued contemporaneously, cyclically, or seriatim." (Peter Corning, "Synergy, Cybernetics, and the Evolution of Politics", 2005)

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)

"Systematic usage of the methods of modern control theory to study physical systems is a key feature of a new research area in physics that may be called cybernetical physics. The subject of cybernetical physics is focused on studying physical systems by means of feedback interactions with the environment. Its methodology heavily relies on the design methods developed in cybernetics. However, the approach of cybernetical physics differs from the conventional use of feedback in control applications (e.g., robotics, mechatronics) aimed mainly at driving a system to a prespecified position or a given trajectory." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"In physical, exponentially growing systems, there must be at least one reinforcing loop driving growth and at least one balancing feedback loop constraining growth, because no system can grow forever in a finite environment." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)
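The interplay Meadows describes, a reinforcing loop driving growth and a balancing loop constraining it, is captured by the classic logistic growth model; a minimal simulation sketch (the function name and parameter values are illustrative, not from the quoted text):

```python
def logistic_growth(p0, r, capacity, steps):
    """Simulate growth with a reinforcing loop (growth proportional to
    the current level p) and a balancing loop (growth shrinks as p
    approaches the environment's finite carrying capacity)."""
    p = p0
    history = [p]
    for _ in range(steps):
        p += r * p * (1 - p / capacity)  # (1 - p/K) is the balancing term
        history.append(p)
    return history

trajectory = logistic_growth(p0=10, r=0.5, capacity=1000, steps=50)
# Growth is near-exponential at first, then levels off near the capacity.
```

With `r = 0.5` the trajectory rises monotonically and saturates at the carrying capacity: the reinforcing loop dominates while the system is small, the balancing loop dominates as it fills its environment.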

"In that sense, a self-organizing system is intrinsically adaptive: it maintains its basic organization in spite of continuing changes in its environment. As noted, perturbations may even make the system more robust, by helping it to discover a more stable organization." (Francis Heylighen, "Complexity and Self-Organization", 2008)

"If universality is one of the observed characteristics of complex dynamical systems in many fields of study, a second characteristic that flows from the study of these systems is that of emergence. As self-organizing systems go about their daily business, they are constantly exchanging matter and energy with their environment, and this allows them to remain in a state that is far from equilibrium. That allows spontaneous behavior to give rise to new patterns." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"To remedy chaotic situations requires a chaotic approach, one that is non-linear, constantly morphing, and continually sharpening its competitive edge with recurring feedback loops that build upon past experiences and lessons learned. Improvement cannot be sustained without reflection. Chaos arises from myriad sources that stem from two origins: internal chaos rising within you, and external chaos being imposed upon you by the environment. The result of this push/pull effect is the disequilibrium [...]." (Jeff Boss, "Navigating Chaos: How to Find Certainty in Uncertain Situations", 2015)

More quotes on "Environment" at the-web-of-knowledge.blogspot.com.

07 February 2014

🕸Systems Engineering: Entropy (Definitions)

"The Entropy of a system is the mechanical work it can perform without communication of heat, or alteration of its total volume, all transference of heat being performed by reversible engines." (James C Maxwell, "Theory of Heat", 1899)

"Entropy is the measure of randomness." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"Entropy [...] is the amount of disorder or randomness present in any system." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"A measurement of the disorder of a data set." (Glenn J Myatt, "Making Sense of Data: A Practical Guide to Exploratory Data Analysis and Data Mining", 2006)

"[...] entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"A measure of the uncertainty associated with a random variable. Entropy quantifies information in a piece of data." (Radu Mutihac, "Bayesian Neural Networks for Image Restoration" [in "Encyclopedia of Artificial Intelligence", 2009])

"Measurement that can be used in machine learning on a set of data that is to be classified. In this setting it can be defined as the amount of uncertainty or randomness (or noise) in the data. If all data is classified with the same class, the entropy of that set would be 0." (Isak Taksa et al, "Machine Learning Approach to Search Query Classification", 2009)

"A measure of uncertainty associated with the predictable value of information content. The highest information entropy is when the ambiguity or uncertainty of the outcome is the greatest." (Alex Berson & Lawrence Dubov, "Master Data Management and Data Governance", 2010)

"Refers to the inherent unknowability of data to external observers. If a bit is just as likely to be a 1 as a 0 and a user does not know which it is, then the bit contains 1 bit of entropy." (Mark S Merkow & Lakshmikanth Raghavan, "Secure and Resilient Software Development", 2010)

"The measurement of uncertainty in an outcome, or randomness in a system." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"A metric used to evaluate and describe the amount of randomness associated with a random variable." (Wenbing Zhao, "Increasing the Trustworthiness of Online Gaming Applications", 2015)

"Anti-entropy is the process of detecting differences in replicas. From a performance perspective, it is important to detect and resolve inconsistencies with a minimum amount of data exchange." (Dan Sullivan, "NoSQL for Mere Mortals®", 2015)

"Average amount of information contained in a sample drawn from a distribution or data stream. Measure of uncertainty of the source of information." (Anwesha Sengupta et al, "Alertness Monitoring System for Vehicle Drivers using Physiological Signals", 2016)

"In information theory this notion, introduced by Claude Shannon, is used to express unpredictability of information content. For instance, if a data set containing n items was divided into k groups each comprising n_i items, then the entropy of such a partition is H = -(p_1 log(p_1) + … + p_k log(p_k)), where p_i = n_i/n. In case of two alternative partitions, the mutual information is a measure of the mutual dependence between these partitions." (Slawomir T Wierzchon, "Ensemble Clustering Data Mining and Databases", 2018)
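The partition entropy in Wierzchon's definition can be computed directly; a minimal sketch in Python (the function name is illustrative):

```python
import math

def partition_entropy(group_sizes):
    """Shannon entropy of a partition of n items into groups of the
    given sizes: H = sum over i of p_i * log2(1/p_i), with p_i = n_i/n."""
    n = sum(group_sizes)
    return sum((ni / n) * math.log2(n / ni) for ni in group_sizes if ni > 0)

# Four equal groups: maximum uncertainty for k = 4, i.e. 2 bits.
print(partition_entropy([25, 25, 25, 25]))  # 2.0
# A single group: the outcome is certain, so the entropy is 0.
print(partition_entropy([100]))  # 0.0
```

This also illustrates the classification case mentioned in the definitions above: when all items fall into one class, the entropy of the partition is zero.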

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"Lack of order or predictability; gradual decline into disorder." (Adrian Carballal et al, "Approach to Minimize Bias on Aesthetic Image Datasets", 2019)

"It is the quantity which is used to describe the amount of information which must be coded for compression algorithm." (Arockia Sukanya & Kamalanand Krishnamurthy, "Thresholding Techniques for Dental Radiographic Images: A Comparative Study", 2019)

"In the physics - rate of system´s messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)