07 February 2014

🕸Systems Engineering: Entropy (Definitions)

"The Entropy of a system is the mechanical work it can perform without communication of heat, or alteration of its total volume, all transference of heat being performed by reversible engines." (James C Maxwell, "Theory of Heat", 1899)

"Entropy is the measure of randomness." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"Entropy [...] is the amount of disorder or randomness present in any system." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"A measurement of the disorder of a data set." (Glenn J Myatt, "Making Sense of Data: A Practical Guide to Exploratory Data Analysis and Data Mining", 2006)

"[...] entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"A measure of the uncertainty associated with a random variable. Entropy quantifies information in a piece of data." (Radu Mutihac, "Bayesian Neural Networks for Image Restoration" [in "Encyclopedia of Artificial Intelligence"], 2009)

"Measurement that can be used in machine learning on a set of data that is to be classified. In this setting it can be defined as the amount of uncertainty or randomness (or noise) in the data. If all data is classified with the same class, the entropy of that set would be 0." (Isak Taksa et al, "Machine Learning Approach to Search Query Classification", 2009)
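The behavior described in this definition can be sketched in a few lines of Python (a minimal illustration, not from any of the quoted sources; the function name and sample labels are my own):

```python
# Shannon entropy of a set of class labels, as used when measuring the
# "purity" of a data set in machine learning (e.g. decision-tree splits).
from collections import Counter
from math import log2

def label_entropy(labels):
    """H = sum over classes of p_i * log2(1/p_i), where p_i is the
    proportion of items carrying class i."""
    n = len(labels)
    return sum((c / n) * log2(n / c) for c in Counter(labels).values())

print(label_entropy(["a", "a", "a", "a"]))  # 0.0 -- one class, no uncertainty
print(label_entropy(["a", "b", "a", "b"]))  # 1.0 -- two equally likely classes
```

As the quote states, a set in which every item has the same class has entropy 0; the maximum is reached when the classes are equally likely.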

"A measure of uncertainty associated with the predictable value of information content. The highest information entropy is when the ambiguity or uncertainty of the outcome is the greatest." (Alex Berson & Lawrence Dubov, "Master Data Management and Data Governance", 2010)

"Refers to the inherent unknowability of data to external observers. If a bit is just as likely to be a 1 as a 0 and a user does not know which it is, then the bit contains 1 bit of entropy." (Mark S Merkow & Lakshmikanth Raghavan, "Secure and Resilient Software Development", 2010)

"The measurement of uncertainty in an outcome, or randomness in a system." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"A metric used to evaluate and describe the amount of randomness associated with a random variable." (Wenbing Zhao, "Increasing the Trustworthiness of Online Gaming Applications", 2015)

"Anti-entropy is the process of detecting differences in replicas. From a performance perspective, it is important to detect and resolve inconsistencies with a minimum amount of data exchange." (Dan Sullivan, "NoSQL for Mere Mortals®", 2015)

"Average amount of information contained in a sample drawn from a distribution or data stream. Measure of uncertainty of the source of information." (Anwesha Sengupta et al, "Alertness Monitoring System for Vehicle Drivers using Physiological Signals", 2016)

"In information theory this notion, introduced by Claude Shannon, is used to express unpredictability of information content. For instance, if a data set containing n items was divided into k groups each comprising n_i items, then the entropy of such a partition is H = -(p_1 log(p_1) + … + p_k log(p_k)), where p_i = n_i / n. In case of two alternative partitions, the mutual information is a measure of the mutual dependence between these partitions." (Slawomir T Wierzchon, "Ensemble Clustering Data Mining and Databases", 2018)
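The partition entropy and mutual information described above can be sketched as follows (a minimal illustration under the quoted definitions; the helper names and the sample partitions are my own, and mutual information is computed via the standard identity I(X; Y) = H(X) + H(Y) - H(X, Y)):

```python
from collections import Counter
from math import log2

def entropy(parts):
    """Partition entropy H = sum of p_i * log2(1/p_i) over group sizes."""
    n = len(parts)
    return sum((c / n) * log2(n / c) for c in Counter(parts).values())

def mutual_information(x, y):
    """Mutual dependence between two partitions of the same items."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

x = [0, 0, 1, 1]  # one partition of four items into two groups
y = [0, 0, 1, 1]  # an identical partition
z = [0, 1, 0, 1]  # an independent partition

print(mutual_information(x, y))  # 1.0 -- identical partitions share all info
print(mutual_information(x, z))  # 0.0 -- independent partitions share none
```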

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"Lack of order or predictability; gradual decline into disorder." (Adrian Carballal et al, "Approach to Minimize Bias on Aesthetic Image Datasets", 2019)

"It is the quantity which is used to describe the amount of information which must be coded for compression algorithm." (Arockia Sukanya & Kamalanand Krishnamurthy, "Thresholding Techniques for Dental Radiographic Images: A Comparative Study", 2019)

"In the physics - rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)
