28 July 2019

IT: Internet of Things (Definitions)

"A term used to describe the community or collection of people and items that use the Internet to communicate with other." (Kenneth A Shaw, "Integrated Management of Processes and Information", 2013)

"The embedding of objects with sensors, coupled with the ability of objects to communicate, driving an explosion in the growth of big data." (Brenda L Dietrich et al, "Analytics Across the Enterprise", 2014)

"The Internet of Things entails the aim of all physical or uniquely identifiable objects being connected through wired and wireless networks. In this notion, every object would be virtually represented. Connecting objects in this way offers a whole new universe of possibilities. Real-time analysis of big data streams could enhance productivity and safety of systems (for example, roadways and cars being part of the Internet of Things could help to manage traffic flow). It can also make everyday life more convenient and sustainable (such as connecting all household devices to save electricity)." (Martin Hoegl et al, "Using Thematic Thinking to Achieve Business Success, Growth, and Innovation", 2014)

"IOT refers to a network of machines that have sensors and are interconnected enabling them to collect and exchange data. This interconnection enables devices to be controlled remotely resulting in process efficiencies and lower costs." (Saumya Chaki, "Enterprise Information Management in Practice", 2015)

"An interconnected network of physical devices, vehicles, buildings, and other items embedded with sensors that gather and share data." (Jonathan Ferrar et al, "The Power of People: Learn How Successful Organizations Use Workforce Analytics To Improve Business Performance", 2017)

"Ordinary devices that are connected to the Internet at any time, anywhere, via sensors." (Jason Williamson, "Getting a Big Data Job For Dummies", 2015)

"Also referred to as IoT. Term that describes the connectivity of objects to the Internet and the ability for these objects to send and receive data from each other." (Brittany Bullard, "Style and Statistics", 2016)

"computing or 'smart' devices often with ­sensor capability and the ability to collect, share, and transfer data using the Internet." (Daniel J. Power & Ciara Heavin, "Data-Based Decision Making and Digital Transformation", 2018)

"The wide-scale deployment of small, low-power computing devices into everyday devices, such as thermostats, refrigerators, clothing, and even into people themselves to continuously monitor health." (O Sami Saydjari, "Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time", 2018)

"A network of physical objects that have, like cell phones and laptops, internet connectivity enabling automatic communication between them and any other machine connected to the internet without human intervention." (Sue Milton, "Data Privacy vs. Data Security", 2021)

"Integration of various processes such as identifying, sensing, networking, and computation." (Revathi Rajendran et al, "Convergence of AI, ML, and DL for Enabling Smart Intelligence: Artificial Intelligence, Machine Learning, Deep Learning, Internet of Things", 2021)

"It is an interdisciplinary field who is associated with the electronics and computer science. Electronics deals with the development of new sensors or hardware for IoT device and computer science deals with the development of software, protocols and cloud based solution to store the data generated form these IoT devices."  (Ajay Sharma, "Smart Agriculture Services Using Deep Learning, Big Data, and IoT", 2021)

"IoT is a network of real-world objects which consists of sensors, software, and other technologies to exchange data with the other systems over the internet." (Hari K Kondaveeti et al, "Deep Learning Applications in Agriculture: The Role of Deep Learning in Smart Agriculture", 2021)

"This refers to a system of inter-connected computing and smart devices, that are provided with unique identifiers and the ability to transfer data over a network without requiring human interaction." (Wissam Abbass et al, "Internet of Things Application for Intelligent Cities: Security Risk Assessment Challenges", 2021)

"describes the network where sensing elements such as sensors, cameras, and devices are increasingly linked together via the internet to connect, communicate and exchange information." (Accenture)

"ordinary devices that are connected to the internet at any time anywhere via sensors." (Analytics Insight)

"Technologies that enable objects and infrastructure to interact with monitoring, analytics, and control systems over internet-style networks." (Forrester)

31 December 2018

Data Science: Big Data (Just the Quotes)

"If we gather more and more data and establish more and more associations, however, we will not finally find that we know something. We will simply end up having more and more data and larger sets of correlations." (Kenneth N Waltz, "Theory of International Politics Source: Theory of International Politics", 1979)

“There are those who try to generalize, synthesize, and build models, and there are those who believe nothing and constantly call for more data. The tension between these two groups is a healthy one; science develops mainly because of the model builders, yet they need the second group to keep them honest.” (Andrew Miall, “Principles of Sedimentary Basin Analysis”, 1984)

"Big Data is data that exceeds the processing capacity of conventional database systems. The data is too big, moves too fast, or doesn’t fit the strictures of your database architectures. To gain value from this data, you must choose an alternative way to process it." (Edd Wilder-James, "What is big data?", 2012) [source]

"The secret to getting the most from Big Data isn’t found in huge server farms or massive parallel computing or in-memory algorithms. Instead, it’s in the almighty pencil." (Matt Ariker, "The One Tool You Need To Make Big Data Work: The Pencil", 2012)

"Big data is the most disruptive force this industry has seen since the introduction of the relational database." (Jeffrey Needham, "Disruptive Possibilities: How Big Data Changes Everything", 2013)

"No subjective metric can escape strategic gaming [...] The possibility of mischief is bottomless. Fighting ratings is fruitless, as they satisfy a very human need. If one scheme is beaten down, another will take its place and wear its flaws. Big Data just deepens the danger. The more complex the rating formulas, the more numerous the opportunities there are to dress up the numbers. The larger the data sets, the harder it is to audit them." (Kaiser Fung, "Numbersense: How To Use Big Data To Your Advantage", 2013)

"There is convincing evidence that data-driven decision-making and big data technologies substantially improve business performance. Data science supports data-driven decision-making - and sometimes conducts such decision-making automatically - and depends upon technologies for 'big data' storage and engineering, but its principles are separate." (Foster Provost & Tom Fawcett, "Data Science for Business", 2013)

"Our needs going forward will be best served by how we make use of not just this data but all data. We live in an era of Big Data. The world has seen an explosion of information in the past decades, so much so that people and institutions now struggle to keep pace. In fact, one of the reasons for the attachment to the simplicity of our indicators may be an inverse reaction to the sheer and bewildering volume of information most of us are bombarded by on a daily basis. […] The lesson for a world of Big Data is that in an environment with excessive information, people may gravitate toward answers that simplify reality rather than embrace the sheer complexity of it." (Zachary Karabell, "The Leading Indicators: A short history of the numbers that rule our world", 2014)

"The other buzzword that epitomizes a bias toward substitution is 'big data'. Today’s companies have an insatiable appetite for data, mistakenly believing that more data always creates more value. But big data is usually dumb data. Computers can find patterns that elude humans, but they don’t know how to compare patterns from different sources or how to interpret complex behaviors. Actionable insights can only come from a human analyst (or the kind of generalized artificial intelligence that exists only in science fiction)." (Peter Thiel & Blake Masters, "Zero to One: Notes on Startups, or How to Build the Future", 2014)

"We have let ourselves become enchanted by big data only because we exoticize technology. We’re impressed with small feats accomplished by computers alone, but we ignore big achievements from complementarity because the human contribution makes them less uncanny. Watson, Deep Blue, and ever-better machine learning algorithms are cool. But the most valuable companies in the future won’t ask what problems can be solved with computers alone. Instead, they’ll ask: how can computers help humans solve hard problems?" (Peter Thiel & Blake Masters, "Zero to One: Notes on Startups, or How to Build the Future", 2014)

"As business leaders we need to understand that lack of data is not the issue. Most businesses have more than enough data to use constructively; we just don't know how to use it. The reality is that most businesses are already data rich, but insight poor." (Bernard Marr, Big Data: Using SMART Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance, 2015)

"Big data is based on the feedback economy where the Internet of Things places sensors on more and more equipment. More and more data is being generated as medical records are digitized, more stores have loyalty cards to track consumer purchases, and people are wearing health-tracking devices. Generally, big data is more about looking at behavior, rather than monitoring transactions, which is the domain of traditional relational databases. As the cost of storage is dropping, companies track more and more data to look for patterns and build predictive models." (Neil Dunlop, "Big Data", 2015)

"Big Data often seems like a meaningless buzz phrase to older database professionals who have been experiencing exponential growth in database volumes since time immemorial. There has never been a moment in the history of database management systems when the increasing volume of data has not been remarkable." (Guy Harrison, "Next Generation Databases: NoSQL, NewSQL, and Big Data", 2015)

"Dimensionality reduction is essential for coping with big data - like the data coming in through your senses every second. A picture may be worth a thousand words, but it’s also a million times more costly to process and remember. [...] A common complaint about big data is that the more data you have, the easier it is to find spurious patterns in it. This may be true if the data is just a huge set of disconnected entities, but if they’re interrelated, the picture changes." (Pedro Domingos, "The Master Algorithm", 2015)

"Science’s predictions are more trustworthy, but they are limited to what we can systematically observe and tractably model. Big data and machine learning greatly expand that scope. Some everyday things can be predicted by the unaided mind, from catching a ball to carrying on a conversation. Some things, try as we might, are just unpredictable. For the vast middle ground between the two, there’s machine learning." (Pedro Domingos, "The Master Algorithm", 2015)

"The human side of analytics is the biggest challenge to implementing big data." (Paul Gibbons, "The Science of Successful Organizational Change", 2015)

"To make progress, every field of science needs to have data commensurate with the complexity of the phenomena it studies. [...] With big data and machine learning, you can understand much more complex phenomena than before. In most fields, scientists have traditionally used only very limited kinds of models, like linear regression, where the curve you fit to the data is always a straight line. Unfortunately, most phenomena in the world are nonlinear. [...] Machine learning opens up a vast new world of nonlinear models." (Pedro Domingos, "The Master Algorithm", 2015)

"Underfitting is when a model doesn’t take into account enough information to accurately model real life. For example, if we observed only two points on an exponential curve, we would probably assert that there is a linear relationship there. But there may not be a pattern, because there are only two points to reference. [...] It seems that the best way to mitigate underfitting a model is to give it more information, but this actually can be a problem as well. More data can mean more noise and more problems. Using too much data and too complex of a model will yield something that works for that particular data set and nothing else." (Matthew Kirk, "Thoughtful Machine Learning", 2015)

"We are moving slowly into an era where Big Data is the starting point, not the end." (Pearl Zhu, "Digital Master: Debunk the Myths of Enterprise Digital Maturity", 2015)

"A popular misconception holds that the era of Big Data means the end of a need for sampling. In fact, the proliferation of data of varying quality and relevance reinforces the need for sampling as a tool to work efficiently with a variety of data, and minimize bias. Even in a Big Data project, predictive models are typically developed and piloted with samples." (Peter C Bruce & Andrew G Bruce, "Statistics for Data Scientists: 50 Essential Concepts", 2016)

"Big data is, in a nutshell, large amounts of data that can be gathered up and analyzed to determine whether any patterns emerge and to make better decisions." (Daniel Covington, Analytics: Data Science, Data Analysis and Predictive Analytics for Business, 2016)

"Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit." (Cathy O'Neil, "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy", 2016)

"While Big Data, when managed wisely, can provide important insights, many of them will be disruptive. After all, it aims to find patterns that are invisible to human eyes. The challenge for data scientists is to understand the ecosystems they are wading into and to present not just the problems but also their possible solutions." (Cathy O'Neil, "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy", 2016)

"Big Data allows us to meaningfully zoom in on small segments of a dataset to gain new insights on who we are." (Seth Stephens-Davidowitz, "Everybody Lies: What the Internet Can Tell Us About Who We Really Are", 2017)

"Effects without an understanding of the causes behind them, on the other hand, are just bunches of data points floating in the ether, offering nothing useful by themselves. Big Data is information, equivalent to the patterns of light that fall onto the eye. Big Data is like the history of stimuli that our eyes have responded to. And as we discussed earlier, stimuli are themselves meaningless because they could mean anything. The same is true for Big Data, unless something transformative is brought to all those data sets… understanding." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"The term [Big Data] simply refers to sets of data so immense that they require new methods of mathematical analysis, and numerous servers. Big Data - and, more accurately, the capacity to collect it - has changed the way companies conduct business and governments look at problems, since the belief wildly trumpeted in the media is that this vast repository of information will yield deep insights that were previously out of reach." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"There are other problems with Big Data. In any large data set, there are bound to be inconsistencies, misclassifications, missing data - in other words, errors, blunders, and possibly lies. These problems with individual items occur in any data set, but they are often hidden in a large mass of numbers even when these numbers are generated out of computer interactions." (David S Salsburg, "Errors, Blunders, and Lies: How to Tell the Difference", 2017)

"Just as they did thirty years ago, machine learning programs (including those with deep neural networks) operate almost entirely in an associational mode. They are driven by a stream of observations to which they attempt to fit a function, in much the same way that a statistician tries to fit a line to a collection of points. Deep neural networks have added many more layers to the complexity of the fitted function, but raw data still drives the fitting process. They continue to improve in accuracy as more data are fitted, but they do not benefit from the 'super-evolutionary speedup'."  (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)

"One of the biggest myths is the belief that data science is an autonomous process that we can let loose on our data to find the answers to our problems. In reality, data science requires skilled human oversight throughout the different stages of the process. [...] The second big myth of data science is that every data science project needs big data and needs to use deep learning. In general, having more data helps, but having the right data is the more important requirement. [...] A third data science myth is that modern data science software is easy to use, and so data science is easy to do. [...] The last myth about data science [...] is the belief that data science pays for itself quickly. The truth of this belief depends on the context of the organization. Adopting data science can require significant investment in terms of developing data infrastructure and hiring staff with data science expertise. Furthermore, data science will not give positive results on every project." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"Apart from the technical challenge of working with the data itself, visualization in big data is different because showing the individual observations is just not an option. But visualization is essential here: for analysis to work well, we have to be assured that patterns and errors in the data have been spotted and understood. That is only possible by visualization with big data, because nobody can look over the data in a table or spreadsheet." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

"With the growing availability of massive data sets and user-friendly analysis software, it might be thought that there is less need for training in statistical methods. This would be naïve in the extreme. Far from freeing us from the need for statistical skills, bigger data and the rise in the number and complexity of scientific studies makes it even more difficult to draw appropriate conclusions. More data means that we need to be even more aware of what the evidence is actually worth." (David Spiegelhalter, "The Art of Statistics: Learning from Data", 2019)

"Big data is revolutionizing the world around us, and it is easy to feel alienated by tales of computers handing down decisions made in ways we don’t understand. I think we’re right to be concerned. Modern data analytics can produce some miraculous results, but big data is often less trustworthy than small data. Small data can typically be scrutinized; big data tends to be locked away in the vaults of Silicon Valley. The simple statistical tools used to analyze small datasets are usually easy to check; pattern-recognizing algorithms can all too easily be mysterious and commercially sensitive black boxes." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Making big data work is harder than it seems. Statisticians have spent the past two hundred years figuring out what traps lie in wait when we try to understand the world through data. The data are bigger, faster, and cheaper these days, but we must not pretend that the traps have all been made safe. They have not." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Many people have strong intuitions about whether they would rather have a vital decision about them made by algorithms or humans. Some people are touchingly impressed by the capabilities of the algorithms; others have far too much faith in human judgment. The truth is that sometimes the algorithms will do better than the humans, and sometimes they won’t. If we want to avoid the problems and unlock the promise of big data, we’re going to need to assess the performance of the algorithms on a case-by-case basis. All too often, this is much harder than it should be. […] So the problem is not the algorithms, or the big datasets. The problem is a lack of scrutiny, transparency, and debate." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"The problem is the hype, the notion that something magical will emerge if only we can accumulate data on a large enough scale. We just need to be reminded: Big data is not better; it’s just bigger. And it certainly doesn’t speak for itself." (Carl T Bergstrom & Jevin D West, "Calling Bullshit: The Art of Skepticism in a Data-Driven World", 2020)

"[...] the focus on Big Data AI seems to be an excuse to put forth a number of vague and hand-waving theories, where the actual details and the ultimate success of neuroscience is handed over to quasi- mythological claims about the powers of large datasets and inductive computation. Where humans fail to illuminate a complicated domain with testable theory, machine learning and big data supposedly can step in and render traditional concerns about finding robust theories. This seems to be the logic of Data Brain efforts today. (Erik J Larson, "The Myth of Artificial Intelligence: Why Computers Can’t Think the Way We Do", 2021)

"Visualizations can remove the background noise from enormous sets of data so that only the most important points stand out to the intended audience. This is particularly important in the era of big data. The more data there is, the more chance for noise and outliers to interfere with the core concepts of the data set." (Kate Strachnyi, "ColorWise: A Data Storyteller’s Guide to the Intentional Use of Color", 2023)

"Visualisation is fundamentally limited by the number of pixels you can pump to a screen. If you have big data, you have way more data than pixels, so you have to summarise your data. Statistics gives you lots of really good tools for this." (Hadley Wickham)

17 December 2018

Data Science: Mathematical Models (Just the Quotes)

"Experience teaches that one will be led to new discoveries almost exclusively by means of special mechanical models." (Ludwig Boltzmann, "Lectures on Gas Theory", 1896)

"If the system exhibits a structure which can be represented by a mathematical equivalent, called a mathematical model, and if the objective can be also so quantified, then some computational method may be evolved for choosing the best schedule of actions among alternatives. Such use of mathematical models is termed mathematical programming."  (George Dantzig, "Linear Programming and Extensions", 1959)

“In fact, the construction of mathematical models for various fragments of the real world, which is the most essential business of the applied mathematician, is nothing but an exercise in axiomatics.” (Marshall Stone, cca 1960)

"[...] sciences do not try to explain, they hardly even try to interpret, they mainly make models. By a model is meant a mathematical construct which, with the addition of certain verbal interpretations, describes observed phenomena. The justification of such a mathematical construct is solely and precisely that it is expected to work - that is, correctly to describe phenomena from a reasonably wide area. Furthermore, it must satisfy certain aesthetic criteria - that is, in relation to how much it describes, it must be rather simple.” (John von Neumann, “Method in the physical sciences”, 1961)

“Mathematical statistics provides an exceptionally clear example of the relationship between mathematics and the external world. The external world provides the experimentally measured distribution curve; mathematics provides the equation (the mathematical model) that corresponds to the empirical curve. The statistician may be guided by a thought experiment in finding the corresponding equation.” (Marshall J Walker, “The Nature of Scientific Thought”, 1963)

"Thus, the construction of a mathematical model consisting of certain basic equations of a process is not yet sufficient for effecting optimal control. The mathematical model must also provide for the effects of random factors, the ability to react to unforeseen variations and ensure good control despite errors and inaccuracies." (Yakov Khurgin, "Did You Say Mathematics?", 1974)

"A mathematical model is any complete and consistent set of mathematical equations which are designed to correspond to some other entity, its prototype. The prototype may be a physical, biological, social, psychological or conceptual entity, perhaps even another mathematical model." (Rutherford Aris, "Mathematical Modelling", 1978)

"Mathematical model making is an art. If the model is too small, a great deal of analysis and numerical solution can be done, but the results, in general, can be meaningless. If the model is too large, neither analysis nor numerical solution can be carried out, the interpretation of the results is in any case very difficult, and there is great difficulty in obtaining the numerical values of the parameters needed for numerical results." (Richard E Bellman, "Eye of the Hurricane: An Autobiography", 1984)

“Theoretical scientists, inching away from the safe and known, skirting the point of no return, confront nature with a free invention of the intellect. They strip the discovery down and wire it into place in the form of mathematical models or other abstractions that define the perceived relation exactly. The now-naked idea is scrutinized with as much coldness and outward lack of pity as the naturally warm human heart can muster. They try to put it to use, devising experiments or field observations to test its claims. By the rules of scientific procedure it is then either discarded or temporarily sustained. Either way, the central theory encompassing it grows. If the abstractions survive they generate new knowledge from which further exploratory trips of the mind can be planned. Through the repeated alternation between flights of the imagination and the accretion of hard data, a mutual agreement on the workings of the world is written, in the form of natural law.” (Edward O Wilson, “Biophilia”, 1984)

“The usual approach of science of constructing a mathematical model cannot answer the questions of why there should be a universe for the model to describe. Why does the universe go to all the bother of existing?” (Stephen Hawking, "A Brief History of Time", 1988)

“Mathematical modeling is about rules - the rules of reality. What distinguishes a mathematical model from, say, a poem, a song, a portrait or any other kind of ‘model’, is that the mathematical model is an image or picture of reality painted with logical symbols instead of with words, sounds or watercolors.” (John L Casti, "Reality Rules, The Fundamentals", 1992)

“Pedantry and sectarianism aside, the aim of theoretical physics is to construct mathematical models such as to enable us, from the use of knowledge gathered in a few observations, to predict by logical processes the outcomes in many other circumstances. Any logically sound theory satisfying this condition is a good theory, whether or not it be derived from ‘ultimate’ or ‘fundamental’ truth.” (Clifford Truesdell & Walter Noll, “The Non-Linear Field Theories of Mechanics” 2nd Ed., 1992)

"Nature behaves in ways that look mathematical, but nature is not the same as mathematics. Every mathematical model makes simplifying assumptions; its conclusions are only as valid as those assumptions. The assumption of perfect symmetry is excellent as a technique for deducing the conditions under which symmetry-breaking is going to occur, the general form of the result, and the range of possible behaviour. To deduce exactly which effect is selected from this range in a practical situation, we have to know which imperfections are present." (Ian Stewart & Martin Golubitsky, "Fearful Symmetry", 1992)

“A model is an imitation of reality and a mathematical model is a particular form of representation. We should never forget this and get so distracted by the model that we forget the real application which is driving the modelling. In the process of model building we are translating our real world problem into an equivalent mathematical problem which we solve and then attempt to interpret. We do this to gain insight into the original real world situation or to use the model for control, optimization or possibly safety studies." (Ian T Cameron & Katalin Hangos, “Process Modelling and Model Analysis”, 2001)

"Formulation of a mathematical model is the first step in the process of analyzing the behaviour of any real system. However, to produce a useful model, one must first adopt a set of simplifying assumptions which have to be relevant in relation to the physical features of the system to be modelled and to the specific information one is interested in. Thus, the aim of modelling is to produce an idealized description of reality, which is both expressible in a tractable mathematical form and sufficiently close to reality as far as the physical mechanisms of interest are concerned." (Francois Axisa, "Discrete Systems" Vol. I, 2001)

"[…] interval mathematics and fuzzy logic together can provide a promising alternative to mathematical modeling for many physical systems that are too vague or too complicated to be described by simple and crisp mathematical formulas or equations. When interval mathematics and fuzzy logic are employed, the interval of confidence and the fuzzy membership functions are used as approximation measures, leading to the so-called fuzzy systems modeling." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001)

"Modeling, in a general sense, refers to the establishment of a description of a system (a plant, a process, etc.) in mathematical terms, which characterizes the input-output behavior of the underlying system. To describe a physical system […] we have to use a mathematical formula or equation that can represent the system both qualitatively and quantitatively. Such a formulation is a mathematical representation, called a mathematical model, of the physical system." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001)

“What is a mathematical model? One basic answer is that it is the formulation in mathematical terms of the assumptions and their consequences believed to underlie a particular ‘real world’ problem. The aim of mathematical modeling is the practical application of mathematics to help unravel the underlying mechanisms involved in, for example, economic, physical, biological, or other systems and processes.” (John A Adam, “Mathematics in Nature”, 2003)

“Mathematical modeling is as much ‘art’ as ‘science’: it requires the practitioner to (i) identify a so-called ‘real world’ problem (whatever the context may be); (ii) formulate it in mathematical terms (the ‘word problem’ so beloved of undergraduates); (iii) solve the problem thus formulated (if possible; perhaps approximate solutions will suffice, especially if the complete problem is intractable); and (iv) interpret the solution in the context of the original problem.” (John A Adam, “Mathematics in Nature”, 2003)

“Mathematical modeling is the application of mathematics to describe real-world problems and investigating important questions that arise from it.” (Sandip Banerjee, “Mathematical Modeling: Models, Analysis and Applications”, 2014)

“A mathematical model is a mathematical description (often by means of a function or an equation) of a real-world phenomenon such as the size of a population, the demand for a product, the speed of a falling object, the concentration of a product in a chemical reaction, the life expectancy of a person at birth, or the cost of emission reductions. The purpose of the model is to understand the phenomenon and perhaps to make predictions about future behavior. [...] A mathematical model is never a completely accurate representation of a physical situation - it is an idealization." (James Stewart, “Calculus: Early Transcendentals” 8th Ed., 2016)
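
A minimal mathematical model in Stewart's sense, a real-world quantity described by a function, might look like this sketch of exponential population growth; the initial size and growth rate are illustrative values.

```python
import numpy as np

P0, r = 1000.0, 0.03                  # illustrative initial size, growth rate
t = np.linspace(0, 50, 6)             # years at which to evaluate the model
print(np.round(P0 * np.exp(r * t)))   # P(t) = P0 * e**(r*t), an idealization
```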

"Machine learning is about making computers learn and perform tasks better based on past historical data. Learning is always based on observations from the data available. The emphasis is on making computers build mathematical models based on that learning and perform tasks automatically without the intervention of humans." (Umesh R Hodeghatta & Umesha Nayak, "Business Analytics Using R: A Practical Approach", 2017)

"Mathematical modeling is the modern version of both applied mathematics and theoretical physics. In earlier times, one proposed not a model but a theory. By talking today of a model rather than a theory, one acknowledges that the way one studies the phenomenon is not unique; it could also be studied other ways. One's model need not claim to be unique or final. It merits consideration if it provides an insight that isn't better provided by some other model." (Reuben Hersh, ”Mathematics as an Empirical Phenomenon, Subject to Modeling”, 2017)

14 December 2018

Data Science: Algorithms (Just the Quotes)

"Mathematics is an aspect of culture as well as a collection of algorithms." (Carl B Boyer, "The History of the Calculus and Its Conceptual Development", 1959)

"Design problems - generating or discovering alternatives - are complex largely because they involve two spaces, an action space and a state space, that generally have completely different structures. To find a design requires mapping the former of these on the latter. For many, if not most, design problems in the real world systematic algorithms are not known that guarantee solutions with reasonable amounts of computing effort. Design uses a wide range of heuristic devices - like means-end analysis, satisficing, and the other procedures that have been outlined - that have been found by experience to enhance the efficiency of search. Much remains to be learned about the nature and effectiveness of these devices." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"An algorithm must be seen to be believed, and the best way to learn what an algorithm is all about is to try it." (Donald E Knuth, The Art of Computer Programming Vol. I, 1968)

"Scientific laws give algorithms, or procedures, for determining how systems behave. The computer program is a medium in which the algorithms can be expressed and applied. Physical objects and mathematical structures can be represented as numbers and symbols in a computer, and a program can be written to manipulate them according to the algorithms. When the computer program is executed, it causes the numbers and symbols to be modified in the way specified by the scientific laws. It thereby allows the consequences of the laws to be deduced." (Stephen Wolfram, "Computer Software in Science and Mathematics", 1984)

"Algorithmic complexity theory and nonlinear dynamics together establish the fact that determinism reigns only over a quite finite domain; outside this small haven of order lies a largely uncharted, vast wasteland of chaos." (Joseph Ford, "Progress in Chaotic Dynamics: Essays in Honor of Joseph Ford's 60th Birthday", 1988)

"On this view, we recognize science to be the search for algorithmic compressions. We list sequences of observed data. We try to formulate algorithms that compactly represent the information content of those sequences. Then we test the correctness of our hypothetical abbreviations by using them to predict the next terms in the string. These predictions can then be compared with the future direction of the data sequence. Without the development of algorithmic compressions of data all science would be replaced by mindless stamp collecting - the indiscriminate accumulation of every available fact. Science is predicated upon the belief that the Universe is algorithmically compressible and the modern search for a Theory of Everything is the ultimate expression of that belief, a belief that there is an abbreviated representation of the logic behind the Universe's properties that can be written down in finite form by human beings." (John D Barrow, New Theories of Everything", 1991)

"Algorithms are a set of procedures to generate the answer to a problem." (Stuart Kauffman, "At Home in the Universe: The Search for Laws of Complexity", 1995)

"Let us regard a proof of an assertion as a purely mechanical procedure using precise rules of inference starting with a few unassailable axioms. This means that an algorithm can be devised for testing the validity of an alleged proof simply by checking the successive steps of the argument; the rules of inference constitute an algorithm for generating all the statements that can be deduced in a finite number of steps from the axioms." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"The vast majority of information that we have on most processes tends to be nonnumeric and nonalgorithmic. Most of the information is fuzzy and linguistic in form." (Timothy J Ross & W Jerry Parkinson, "Fuzzy Set Theory, Fuzzy Logic, and Fuzzy Systems", 2002)

"Knowledge is encoded in models. Models are synthetic sets of rules, and pictures, and algorithms providing us with useful representations of the world of our perceptions and of their patterns." (Didier Sornette, "Why Stock Markets Crash - Critical Events in Complex Systems", 2003)

"Swarm Intelligence can be defined more precisely as: Any attempt to design algorithms or distributed problem-solving methods inspired by the collective behavior of the social insect colonies or other animal societies. The main properties of such systems are flexibility, robustness, decentralization and self-organization." ("Swarm Intelligence in Data Mining", Ed. Ajith Abraham et al, 2006)

"The burgeoning field of computer science has shifted our view of the physical world from that of a collection of interacting material particles to one of a seething network of information. In this way of looking at nature, the laws of physics are a form of software, or algorithm, while the material world - the hardware - plays the role of a gigantic computer." (Paul C W Davies, "Laying Down the Laws", New Scientist, 2007)

"An algorithm refers to a successive and finite procedure by which it is possible to solve a certain problem. Algorithms are the operational base for most computer programs. They consist of a series of instructions that, thanks to programmers’ prior knowledge about the essential characteristics of a problem that must be solved, allow a step-by-step path to the solution." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Programming is a science dressed up as art, because most of us don’t understand the physics of software and it’s rarely, if ever, taught. The physics of software is not algorithms, data structures, languages, and abstractions. These are just tools we make, use, and throw away. The real physics of software is the physics of people. Specifically, it’s about our limitations when it comes to complexity and our desire to work together to solve large problems in pieces. This is the science of programming: make building blocks that people can understand and use easily, and people will work together to solve the very largest problems." (Pieter Hintjens, "ZeroMQ: Messaging for Many Applications", 2012)

"These nature-inspired algorithms gradually became more and more attractive and popular among the evolutionary computation research community, and together they were named swarm intelligence, which became the little brother of the major four evolutionary computation algorithms." (Yuhui Shi, "Emerging Research on Swarm Intelligence and Algorithm Optimization", Information Science Reference, 2014)

"[...] algorithms, which are abstract or idealized process descriptions that ignore details and practicalities. An algorithm is a precise and unambiguous recipe. It’s expressed in terms of a fixed set of basic operations whose meanings are completely known and specified. It spells out a sequence of steps using those operations, with all possible situations covered, and it’s guaranteed to stop eventually." (Brian W Kernighan, "Understanding the Digital World", 2017)

"An algorithm is the computer science version of a careful, precise, unambiguous recipe or tax form, a sequence of steps that is guaranteed to compute a result correctly." (Brian W Kernighan, "Understanding the Digital World", 2017)

"Again, classical statistics only summarizes data, so it does not provide even a language for asking [a counterfactual] question. Causal inference provides a notation and, more importantly, offers a solution. As with predicting the effect of interventions [...], in many cases we can emulate human retrospective thinking with an algorithm that takes what we know about the observed world and produces an answer about the counterfactual world." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)

"Algorithms describe the solution to a problem in terms of the data needed to represent the  problem instance and a set of steps necessary to produce the intended result." (Bradley N Miller et al, "Python Programming in Context", 2019)

"An algorithm, meanwhile, is a step-by-step recipe for performing a series of actions, and in most cases 'algorithm' means simply 'computer program'." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Big data is revolutionizing the world around us, and it is easy to feel alienated by tales of computers handing down decisions made in ways we don’t understand. I think we’re right to be concerned. Modern data analytics can produce some miraculous results, but big data is often less trustworthy than small data. Small data can typically be scrutinized; big data tends to be locked away in the vaults of Silicon Valley. The simple statistical tools used to analyze small datasets are usually easy to check; pattern-recognizing algorithms can all too easily be mysterious and commercially sensitive black boxes." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Each of us is sweating data, and those data are being mopped up and wrung out into oceans of information. Algorithms and large datasets are being used for everything from finding us love to deciding whether, if we are accused of a crime, we go to prison before the trial or are instead allowed to post bail. We all need to understand what these data are and how they can be exploited." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Many people have strong intuitions about whether they would rather have a vital decision about them made by algorithms or humans. Some people are touchingly impressed by the capabilities of the algorithms; others have far too much faith in human judgment. The truth is that sometimes the algorithms will do better than the humans, and sometimes they won’t. If we want to avoid the problems and unlock the promise of big data, we’re going to need to assess the performance of the algorithms on a case-by-case basis. All too often, this is much harder than it should be. […] So the problem is not the algorithms, or the big datasets. The problem is a lack of scrutiny, transparency, and debate." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

More quotes on "Algorithms" at the-web-of-knowledge.blogspot.com.

23 September 2018

Data Science: On Computation (Just the Quotes)

"If the system exhibits a structure which can be represented by a mathematical equivalent, called a mathematical model, and if the objective can be also so quantified, then some computational method may be evolved for choosing the best schedule of actions among alternatives. Such use of mathematical models is termed mathematical programming."  (George Dantzig, "Linear Programming and Extensions", 1959)

"Computers do not decrease the need for mathematical analysis, but rather greatly increase this need. They actually extend the use of analysis into the fields of computers and computation, the former area being almost unknown until recently, the latter never having been as intensively investigated as its importance warrants. Finally, it is up to the user of computational equipment to define his needs in terms of his problems, In any case, computers can never eliminate the need for problem-solving through human ingenuity and intelligence." (Richard E Bellman & Paul Brock, "On the Concepts of a Problem and Problem-Solving", American Mathematical Monthly 67, 1960)

"Cellular automata are discrete dynamical systems with simple construction but complex self-organizing behaviour. Evidence is presented that all one-dimensional cellular automata fall into four distinct universality classes. Characterizations of the structures generated in these classes are discussed. Three classes exhibit behaviour analogous to limit points, limit cycles and chaotic attractors. The fourth class is probably capable of universal computation, so that properties of its infinite time behaviour are undecidable." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"The formal structure of a decision problem in any area can be put into four parts: (1) the choice of an objective function denning the relative desirability of different outcomes; (2) specification of the policy alternatives which are available to the agent, or decisionmaker, (3) specification of the model, that is, empirical relations that link the objective function, or the variables that enter into it, with the policy alternatives and possibly other variables; and (4) computational methods for choosing among the policy alternatives that one which performs best as measured by the objective function." (Kenneth Arrow, "The Economics of Information", 1984)

"In spite of the insurmountable computational limits, we continue to pursue the many problems that possess the characteristics of organized complexity. These problems are too important for our well being to give up on them. The main challenge in pursuing these problems narrows down fundamentally to one question: how to deal with systems and associated problems whose complexities are beyond our information processing limits? That is, how can we deal with these problems if no computational power alone is sufficient?"  (George Klir, "Fuzzy sets and fuzzy logic", 1995)

"Small changes in the initial conditions in a chaotic system produce dramatically different evolutionary histories. It is because of this sensitivity to initial conditions that chaotic systems are inherently unpredictable. To predict a future state of a system, one has to be able to rely on numerical calculations and initial measurements of the state variables. Yet slight errors in measurement combined with extremely small computational errors (from roundoff or truncation) make prediction impossible from a practical perspective. Moreover, small initial errors in prediction grow exponentially in chaotic systems as the trajectories evolve. Thus, theoretically, prediction may be possible with some chaotic processes if one is interested only in the movement between two relatively close points on a trajectory. When longer time intervals are involved, the situation becomes hopeless."(Courtney Brown, "Chaos and Catastrophe Theories", 1995)

 "An artificial neural network (or simply a neural network) is a biologically inspired computational model that consists of processing elements (neurons) and connections between them, as well as of training and recall algorithms." (Nikola K Kasabov, "Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering", 1996)

"In science, it is a long-standing tradition to deal with perceptions by converting them into measurements. But what is becoming increasingly evident is that, to a much greater extent than is generally recognized, conversion of perceptions into measurements is infeasible, unrealistic or counter-productive. With the vast computational power at our command, what is becoming feasible is a counter-traditional move from measurements to perceptions. […] To be able to compute with perceptions it is necessary to have a means of representing their meaning in a way that lends itself to computation." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic: A personal perspective", 1999)

"Theories of choice are at best approximate and incomplete. One reason for this pessimistic assessment is that choice is a constructive and contingent process. When faced with a complex problem, people employ a variety of heuristic procedures in order to simplify the representation and the evaluation of prospects. These procedures include computational shortcuts and editing operations, such as eliminating common components and discarding nonessential differences. The heuristics of choice do not readily lend themselves to formal analysis because their application depends on the formulation of the problem, the method of elicitation, and the context of choice." (Amos Tversky & Daniel Kahneman, "Advances in Prospect Theory: Cumulative Representation of Uncertainty" [in "Choices, Values, and Frames"], 2000)

"Prime numbers belong to an exclusive world of intellectual conceptions. We speak of those marvelous notions that enjoy simple, elegant description, yet lead to extreme - one might say unthinkable - complexity in the details. The basic notion of primality can be accessible to a child, yet no human mind harbors anything like a complete picture. In modern times, while theoreticians continue to grapple with the profundity of the prime numbers, vast toil and resources have been directed toward the computational aspect, the task of finding, characterizing, and applying the primes in other domains." (Richard Crandall & Carl Pomerance, "Prime Numbers: A Computational Perspective", 2001)

"Complexity Theory is concerned with the study of the intrinsic complexity of computational tasks. Its 'final' goals include the determination of the complexity of any well-defined task. Additional goals include obtaining an understanding of the relations between various computational phenomena (e.g., relating one fact regarding computational complexity to another). Indeed, we may say that the former type of goal is concerned with absolute answers regarding specific computational phenomena, whereas the latter type is concerned with questions regarding the relation between computational phenomena." (Oded Goldreich, "Computational Complexity: A Conceptual Perspective", 2008)

"Granular computing is a general computation theory for using granules such as subsets, classes, objects, clusters, and elements of a universe to build an efficient computational model for complex applications with huge amounts of data, information, and knowledge. Granulation of an object a leads to a collection of granules, with a granule being a clump of points (objects) drawn together by indiscernibility, similarity, proximity, or functionality. In human reasoning and concept formulation, the granules and the values of their attributes are fuzzy rather than crisp. In this perspective, fuzzy information granulation may be viewed as a mode of generalization, which can be applied to any concept, method, or theory." (Salvatore Greco et al, "Granular Computing and Data Mining for Ordered Data: The Dominance-Based Rough Set Approach", 2009)

"How are we to explain the contrast between the matter-of-fact way in which v-1 and other imaginary numbers are accepted today and the great difficulty they posed for learned mathematicians when they first appeared on the scene? One possibility is that mathematical intuitions have evolved over the centuries and people are generally more willing to see mathematics as a matter of manipulating symbols according to rules and are less insistent on interpreting all symbols as representative of one or another aspect of physical reality. Another, less self-congratulatory possibility is that most of us are content to follow the computational rules we are taught and do not give a lot of thought to rationales." (Raymond S Nickerson, "Mathematical Reasoning: Patterns, Problems, Conjectures, and Proofs", 2009)

"It should also be noted that the novel information generated by interactions in complex systems limits their predictability. Without randomness, complexity implies a particular non-determinism characterized by computational irreducibility. In other words, complex phenomena cannot be known a priori." (Carlos Gershenson, "Complexity", 2011)

"The notion of emergence is used in a variety of disciplines such as evolutionary biology, the philosophy of mind and sociology, as well as in computational and complexity theory. It is associated with non-reductive naturalism, which claims that a hierarchy of levels of reality exist. While the emergent level is constituted by the underlying level, it is nevertheless autonomous from the constituting level. As a naturalistic theory, it excludes non-natural explanations such as vitalistic forces or entelechy. As non-reductive naturalism, emergence theory claims that higher-level entities cannot be explained by lower-level entities." (Martin Neumann, "An Epistemological Gap in Simulation Technologies and the Science of Society", 2011)

"Black Swans (capitalized) are large-scale unpredictable and irregular events of massive consequence - unpredicted by a certain observer, and such un - predictor is generally called the 'turkey' when he is both surprised and harmed by these events. [...] Black Swans hijack our brains, making us feel we 'sort of' or 'almost' predicted them, because they are retrospectively explainable. We don’t realize the role of these Swans in life because of this illusion of predictability. […] An annoying aspect of the Black Swan problem - in fact the central, and largely missed, point - is that the odds of rare events are simply not computable." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"[…] there exists a close relation between design analysis of algorithm and computational complexity theory. The former is related to the analysis of the resources (time and/or space) utilized by a particular algorithm to solve a problem and the later is related to a more general question about all possible algorithms that could be used to solve the same problem. There are different types of time complexity for different algorithms." (Shyamalendu Kandar, "Introduction to Automata Theory, Formal Languages and Computation", 2013)

"These nature-inspired algorithms gradually became more and more attractive and popular among the evolutionary computation research community, and together they were named swarm intelligence, which became the little brother of the major four evolutionary computation algorithms." (Yuhui Shi, "Emerging Research on Swarm Intelligence and Algorithm Optimization", Information Science Reference, 2014)

"The higher the dimension, in other words, the higher the number of possible interactions, and the more disproportionally difficult it is to understand the macro from the micro, the general from the simple units. This disproportionate increase of computational demands is called the curse of dimensionality." (Nassim N Taleb, "Skin in the Game: Hidden Asymmetries in Daily Life", 2018)

"Computational complexity theory, or just complexity theory, is the study of the difficulty of computational problems. Rather than focusing on specific algorithms, complexity theory focuses on problems." (Rod Stephens, "Essential Algorithms" 2nd Ed., 2019)

06 May 2018

Data Science: Swarm Intelligence (Definitions)

"Swarm systems generate novelty for three reasons: (1) They are 'sensitive to initial conditions' - a scientific shorthand for saying that the size of the effect is not proportional to the size of the cause - so they can make a surprising mountain out of a molehill. (2) They hide countless novel possibilities in the exponential combinations of many interlinked individuals. (3) They don’t reckon individuals, so therefore individual variation and imperfection can be allowed. In swarm systems with heritability, individual variation and imperfection will lead to perpetual novelty, or what we call evolution." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Dumb parts, properly connected into a swarm, yield smart results." (Kevin Kelly, "New Rules for the New Economy", 1999)

"It is, however, fair to say that very few applications of swarm intelligence have been developed. One of the main reasons for this relative lack of success resides in the fact that swarm-intelligent systems are hard to 'program', because the paths to problem solving are not predefined but emergent in these systems and result from interactions among individuals and between individuals and their environment as much as from the behaviors of the individuals themselves. Therefore, using a swarm-intelligent system to solve a problem requires a thorough knowledge not only of what individual behaviors must be implemented but also of what interactions are needed to produce such or such global behavior." (Eric Bonabeau et al, "Swarm Intelligence: From Natural to Artificial Systems", 1999)

"Just what valuable insights do ants, bees, and other social insects hold? Consider termites. Individually, they have meager intelligence. And they work with no supervision. Yet collectively they build mounds that are engineering marvels, able to maintain ambient temperature and comfortable levels of oxygen and carbon dioxide even as the nest grows. Indeed, for social insects teamwork is largely self-organized, coordinated primarily through the interactions of individual colony members. Together they can solve difficult problems (like choosing the shortest route to a food source from myriad possible pathways) even though each interaction might be very simple (one ant merely following the trail left by another). The collective behavior that emerges from a group of social insects has been dubbed 'swarm intelligence'." (Eric Bonabeau & Christopher Meyer, Swarm Intelligence: A Whole New Way to Think About Business, Harvard Business Review, 2001)

"[…] swarm intelligence is becoming a valuable tool for optimizing the operations of various businesses. Whether similar gains will be made in helping companies better organize themselves and develop more effective strategies remains to be seen. At the very least, though, the field provides a fresh new framework for solving such problems, and it questions the wisdom of certain assumptions regarding the need for employee supervision through command-and-control management. In the future, some companies could build their entire businesses from the ground up using the principles of swarm intelligence, integrating the approach throughout their operations, organization, and strategy. The result: the ultimate self-organizing enterprise that could adapt quickly - and instinctively - to fast-changing markets." (Eric Bonabeau & Christopher Meyer, "Swarm Intelligence: A Whole New Way to Think About Business", Harvard Business Review, 2001)

"Swarm Intelligence can be defined more precisely as: Any attempt to design algorithms or distributed problem-solving methods inspired by the collective behavior of the social insect colonies or other animal societies. The main properties of such systems are flexibility, robustness, decentralization and self-organization." (Ajith Abraham et al, "Swarm Intelligence in Data Mining", 2006)

"Swarm intelligence can be effective when applied to highly complicated problems with many nonlinear factors, although it is often less effective than the genetic algorithm approach discussed later in this chapter. Swarm intelligence is related to swarm optimization […]. As with swarm intelligence, there is some evidence that at least some of the time swarm optimization can produce solutions that are more robust than genetic algorithms. Robustness here is defined as a solution’s resistance to performance degradation when the underlying variables are changed." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)

[swarm intelligence] "Refers to a class of algorithms inspired by the collective behaviour of insect swarms, ant colonies, the flocking behaviour of some bird species, or the herding behaviour of some mammals, such that the behaviour of the whole can be considered as exhibiting a rudimentary form of 'intelligence'." (John Fulcher, "Intelligent Information Systems", 2009)

"The property of a system whereby the collective behaviors of unsophisticated agents interacting locally with their environment cause coherent functional global patterns to emerge." (M L Gavrilova, "Adaptive Algorithms for Intelligent Geometric Computing", 2009) 

[swarm intelligence] "Is a discipline that deals with natural and artificial systems composed of many individuals that coordinate using decentralized control and self-organization. In particular, SI focuses on the collective behaviors that result from the local interactions of the individuals with each other and with their environment." (Elina Pacini et al, "Schedulers Based on Ant Colony Optimization for Parameter Sweep Experiments in Distributed Environments", 2013). 

"Swarm intelligence (SI) is a branch of computational intelligence that discusses the collective behavior emerging within self-organizing societies of agents. SI was inspired by the observation of the collective behavior in societies in nature such as the movement of birds and fish. The collective behavior of such ecosystems, and their artificial counterpart of SI, is not encoded within the set of rules that determines the movement of each isolated agent, but it emerges through the interaction of multiple agents." (Maximos A Kaliakatsos-Papakostas et al, "Intelligent Music Composition", 2013)

"Collective intelligence of societies of biological (social animals) or artificial (robots, computer agents) individuals. In artificial intelligence, it gave rise to a computational paradigm based on decentralisation, self-organisation, local interactions, and collective emergent behaviours." (D T Pham & M Castellani, "The Bees Algorithm as a Biologically Inspired Optimisation Method", 2015)

"It is the field of artificial intelligence in which the population is in the form of agents which search in a parallel fashion with multiple initialization points. The swarm intelligence-based algorithms mimic the physical and natural processes for mathematical modeling of the optimization algorithm. They have the properties of information interchange and non-centralized control structure." (Sajad A Rather & P Shanthi Bala, "Analysis of Gravitation-Based Optimization Algorithms for Clustering and Classification", 2020)

"It [swarm intelligence] is the discipline dealing with natural and artificial systems consisting of many individuals who coordinate through decentralized monitoring and self-organization." (Mehmet A Cifci, "Optimizing WSNs for CPS Using Machine Learning Techniques", 2021)

Resources:
More quotes on "Swarm Intelligence" at the-web-of-knowledge.blogspot.com.

23 December 2005

IT: Computing (Just the Quotes)

"Let it be remarked [...] that an important difference between the way in which we use the brain and the machine is that the machine is intended for many successive runs, either with no reference to each other, or with a minimal, limited reference, and that it can be cleared between such runs; while the brain, in the course of nature, never even approximately clears out its past records. Thus the brain, under normal circumstances, is not the complete analogue of the computing machine but rather the analogue of a single run on such a machine." (Norbert Wiener, "Cybernetics: Or Control and Communication in the Animal and the Machine", 1948)

"There are two types of systems engineering - basis and applied. [...] Systems engineering is, obviously, the engineering of a system. It usually, but not always, includes dynamic analysis, mathematical models, simulation, linear programming, data logging, computing, optimating, etc., etc. It connotes an optimum method, realized by modern engineering techniques. Basic systems engineering includes not only the control system but also all equipment within the system, including all host equipment for the control system. Applications engineering is - and always has been - all the engineering required to apply the hardware of a hardware manufacturer to the needs of the customer. Such applications engineering may include, and always has included where needed, dynamic analysis, mathematical models, simulation, linear programming, data logging, computing, and any technique needed to meet the end purpose - the fitting of an existing line of production hardware to a customer's needs. This is applied systems engineering." (Instruments and Control Systems Vol. 31, 1958)

"The mathematical and computing techniques for making programmed decisions replace man but they do not generally simulate him." (Herbert A Simon, "Management and Corporations 1985", 1960)

"There is the very real danger that a number of problems which could profitably be subjected to analysis, and so treated by simpler and more revealing techniques. will instead be routinely shunted to the computing machines [...] The role of computing machines as a mathematical tool is not that of a panacea for all computational ills." (Richard E Bellman & Paul Brock, "On the Concepts of a Problem and Problem-Solving", American Mathematical Monthly 67, 1960)

"The purpose of computing is insight, not numbers." (Richard W Hamming, "Numerical Methods for Scientists and Engineers", 1962)

"Another thing I must point out is that you cannot prove a vague theory wrong. If the guess that you make is poorly expressed and rather vague, and the method that you use for figuring out the consequences is a little vague - you are not sure, and you say, 'I think everything's right because it's all due to so and so, and such and such do this and that more or less, and I can sort of explain how this works' […] then you see that this theory is good, because it cannot be proved wrong! Also if the process of computing the consequences is indefinite, then with a little skill any experimental results can be made to look like the expected consequences." (Richard P Feynman, "The Character of Physical Law", 1965)

"Computational reducibility may well be the exception rather than the rule: Most physical questions may be answerable only through irreducible amounts of computation. Those that concern idealized limits of infinite time, volume, or numerical precision can require arbitrarily long computations, and so be formally undecidable." (Stephen Wolfram, Undecidability and intractability in theoretical physics", Physical Review Letters 54 (8), 1985)

"We distinguish diagrammatic from sentential paper-and-pencil representations of information by developing alternative models of information-processing systems that are informationally equivalent and that can be characterized as sentential or diagrammatic. Sentential representations are sequential, like the propositions in a text. Diagrammatic representations are indexed by location in a plane. Diagrammatic representations also typically display information that is only implicit in sentential representations and that therefore has to be computed, sometimes at great cost, to make it explicit for use. We then contrast the computational efficiency of these representations for solving several. illustrative problems in mathematics and physics." (Herbert A Simon, "Why a diagram is (sometimes) worth ten thousand words", 1987)

"Neural computing is the study of cellular networks that have a natural property for storing experimental knowledge. Such systems bear a resemblance to the brain in the sense that knowledge is acquired through training rather than programming and is retained due to changes in node functions. The knowledge takes the form of stable states or cycles of states in the operation of the net. A central property of such nets is to recall these states or cycles in response to the presentation of cues." (Igor Aleksander & Helen Morton, "Neural computing architectures: the design of brain-like machines", 1989)

"Beauty is more important in computing than anywhere else in technology because software is so complicated. Beauty is the ultimate defense against complexity." (David Gelernter, "Machine Beauty: Elegance And The Heart Of Technolog", 1998)

"As systems became more varied and more complex, we find that no single methodology suffices to deal with them. This is particularly true of what may be called information intelligent systems - systems which form the core of modern technology. To conceive, design, analyze and use such systems we frequently have to employ the totality of tools that are available. Among such tools are the techniques centered on fuzzy logic, neurocomputing, evolutionary computing, probabilistic computing and related methodologies. It is this conclusion that formed the genesis of the concept of soft computing." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic: A personal perspective", 1999)

"In science, it is a long-standing tradition to deal with perceptions by converting them into measurements. But what is becoming increasingly evident is that, to a much greater extent than is generally recognized, conversion of perceptions into measurements is infeasible, unrealistic or counter-productive. With the vast computational power at our command, what is becoming feasible is a counter-traditional move from measurements to perceptions. […] To be able to compute with perceptions it is necessary to have a means of representing their meaning in a way that lends itself to computation." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic: A personal perspective", 1999)

"Why was progress in computing technology so fast compared with the lack of progress in space travel? The reason is very simple: computing technology is only now approaching scientific limits such as quantum uncertainty and the speed of light, while space technology has already run into its limits that derive from the basic principles of physics and chemistry." (Mordechai Ben-Ari, "Just a Theory: Exploring the Nature of Science", 2005)

"Granular computing is a general computation theory for using granules such as subsets, classes, objects, clusters, and elements of a universe to build an efficient computational model for complex applications with huge amounts of data, information, and knowledge. Granulation of an object a leads to a collection of granules, with a granule being a clump of points (objects) drawn together by indiscernibility, similarity, proximity, or functionality. In human reasoning and concept formulation, the granules and the values of their attributes are fuzzy rather than crisp. In this perspective, fuzzy information granulation may be viewed as a mode of generalization, which can be applied to any concept, method, or theory." (Salvatore Greco et al, "Granular Computing and Data Mining for Ordered Data: The Dominance-Based Rough Set Approach", 2009)

15 December 2005

IT: Peer-to-Peer Network (Definitions)

[peer-to-peer computing:] "Users loosely connected through online connections that enable them to share data and programs." (Greg Perry, "Sams Teach Yourself Beginning Programming in 24 Hours 2nd Ed.", 2001)

[peer-to-peer computing:] "A distributed computing model in which each node has equal standing among the collection of nodes. In the most typical usage of this term, the same capabilities are offered by each node, and any node can initiate a communication session with another node. This contrasts with, for example, client-server computing. The capabilities that are shared in peer-to-peer computing include file-sharing as well as computation." (Beverly A Sanders, "Patterns for Parallel Programming", 2004)

"A network comprised of individual participants that have equal capabilities and duties." (Andy Walker, "Absolute Beginner’s Guide To: Security, Spam, Spyware & Viruses", 2005)

"A blanket term used to describe: (1) a peer-centric distributed software architecture, (2) a flavor of software that encourages collaboration and file sharing between peers, and (3) a cultural progression in the way humans and applications interact with each other that emphasizes two way interactive 'conversations' in place of the Web’s initial television-like communication model (where information only flows in one direction)." (Craig F Smith & H Peter Alesso, "Thinking on the Web: Berners-Lee, Gödel and Turing", 2008)

"A networking system in which nodes in a network exchange data directly instead of going through a central server. " (Marcia Kaufman et al, "Big Data For Dummies", 2013)

"A network where all computers can both share and acces resources from other computers on the same network; a decentralized network." (Faithe Wempen, "Computing Fundamentals: Introduction to Computers", 2015)

"A type of network in which a group of personal computers is interconnected so that the hard disks, CD ROMs, files, and printers of each computer can be accessed from every other computer on the network. Peer-to-peer networks do not have a central file server. This type of system is used if less than a dozen computers will be networked." (James R Kalyvas & Michael R Overly, "Big Data: A Businessand Legal Guide", 2015)

"A decentralized network where participants have equal privileges and make certain resources directly available to other network participants." (AICPA)

