30 December 2014

🕸Systems Engineering: Information Theory (Just the Quotes)

"[…] information theory is characterised essentially by its dealing always with a set of possibilities; both its primary data and its final statements are almost always about the set as such, and not about some individual element in the set." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"The general notion in communication theory is that of information. In many cases, the flow of information corresponds to a flow of energy, e. g. if light waves emitted by some objects reach the eye or a photoelectric cell, elicit some reaction of the organism or some machinery, and thus convey information." (Ludwig von Bertalanffy, "General System Theory", 1968) 

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'noise' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"The field of 'information theory' began by using the old hardware paradigm of transportation of data from point to point." (Marshall McLuhan & Eric McLuhan, Laws of Media: The New Science, 1988)

"Without an understanding of causality there can be no theory of communication. What passes as information theory today is not communication at all, but merely transportation." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"If quantum communication and quantum computation are to flourish, a new information theory will have to be developed." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)
