06 April 2006

🖍️Nate Silver - Collected Quotes

 "A forecaster should almost never ignore data, especially when she is studying rare events […]. Ignoring data is often a tip-off that the forecaster is overconfident, or is overfitting her model - that she is interested in showing off rather than trying to be accurate."  (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Complex systems seem to have this property, with large periods of apparent stasis marked by sudden and catastrophic failures. These processes may not literally be random, but they are so irreducibly complex (right down to the last grain of sand) that it just won’t be possible to predict them beyond a certain level. […] And yet complex processes produce order and beauty when you zoom out and look at them from enough distance." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Data-driven predictions can succeed - and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The instinctual shortcut that we take when we have 'too much information' is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The most basic tenet of chaos theory is that a small change in initial conditions - a butterfly flapping its wings in Brazil - can produce a large and unexpected divergence in outcomes - a tornado in Texas. This does not mean that the behavior of the system is random, as the term 'chaos' might seem to imply. Nor is chaos theory some modern recitation of Murphy’s Law ('whatever can go wrong will go wrong'). It just means that certain types of systems are very hard to predict." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The systems are dynamic, meaning that the behavior of the system at one point in time influences its behavior in the future; And they are nonlinear, meaning they abide by exponential rather than additive relationships." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"We forget - or we willfully ignore - that our models are simplifications of the world. We figure that if we make a mistake, it will be at the margin. In complex systems, however, mistakes are not measured in degrees but in whole orders of magnitude." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"We need to stop, and admit it: we have a prediction problem. We love to predict things - and we aren't very good at it." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Whether information comes in a quantitative or qualitative flavor is not as important as how you use it. [...] The key to making a good forecast […] is not in limiting yourself to quantitative information. Rather, it’s having a good process for weighing the information appropriately. […] collect as much information as possible, but then be as rigorous and disciplined as possible when analyzing it. [...] Many times, in fact, it is possible to translate qualitative information into quantitative information." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Statistics is the science of finding relationships and actionable insights from data." (Nate Silver)

🖍️Beau Lotto - Collected Quotes

"Effects without an understanding of the causes behind them, on the other hand, are just bunches of data points floating in the ether, offering nothing useful by themselves. Big Data is information, equivalent to the patterns of light that fall onto the eye. Big Data is like the history of stimuli that our eyes have responded to. And as we discussed earlier, stimuli are themselves meaningless because they could mean anything. The same is true for Big Data, unless something transformative is brought to all those data sets… understanding." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"New information is constantly flowing in, and your brain is constantly integrating it into this statistical distribution that creates your next perception (so in this sense 'reality' is just the product of your brain’s ever-evolving database of consequence). As such, your perception is subject to a statistical phenomenon known in probability theory as kurtosis. Kurtosis in essence means that things tend to become increasingly steep in their distribution [...] that is, skewed in one direction. This applies to ways of seeing everything from current events to ourselves as we lean 'skewedly' toward one interpretation, positive or negative. Things that are highly kurtotic, or skewed, are hard to shift away from. This is another way of saying that seeing differently isn’t just conceptually difficult - it’s statistically difficult." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"Our assumptions are un question ably interconnected. They are nodes with connections (edges) to other nodes. The more foundational the assumption, the more strongly connected it is. What I’m suggesting is that our assumptions and the highly sensitive network of responses, perceptions, behaviors, thoughts, and ideas they create and interact with are a complex system. One of the most basic features of such a network is that when you move or disrupt one thing that is strongly connected, you don’t just affect that one thing, you affect all the other things that are connected to it. Hence small causes can have massive effects (but they don’t have to, and usually don’t actually). In a system of high tension, simple questions targeting basic assumptions have the potential to transform perception in radical  and unpredictable ways." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"Questioning our assumptions is what provokes revolutions, be they tiny or vast, technological or social." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"Understanding reduces the complexity of data by collapsing the dimensionality of information to a lower set of known variables. s revolutions, be they tiny or vast, technological or social." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"The basis of complex systems is actually quite simple (and this is not an attempt to be paradoxical, like an art critic who describes a sculpture as 'big yet small'). What makes a system unpredictable and thus nonlinear (which includes you and your perceptual process, or the process of making collective decisions) is that the components making up the system are interconnected." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"The greatest leaders possess a combination of divergent traits: they are both experts and naïve, creative and efficient, serious and playful, social and reclusive - or at the very least, they surround themselves with this dynamic." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017) 

"The term [Big Data] simply refers to sets of data so immense that they require new methods of mathematical analysis, and numerous servers. Big Data - and, more accurately, the capacity to collect it - has changed the way companies conduct business and governments look at problems, since the belief wildly trumpeted in the media is that this vast repository of information will yield deep insights that were previously out of reach." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"Trust is fundamental to leading others into the dark, since trust enables fear to be 'actionable' as courage rather than actionable as anger. Since the bedrock of trust is faith that all will be OK within uncertainty, leaders’ fundamental role is to ultimately lead themselves. Research has found that successful leaders share three behavioral traits: they lead by example, admit their mistakes, and see positive qualities in others. All three are linked to spaces of play. Leading by example creates a space that is trusted - and without trust, there is no play. Admitting mistakes is to celebrate uncertainty. Seeing qualities in others is to encourage diversity." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"Understanding transcends context, since the different contexts collapse according to their previously unknown similarity, which the principle contains. That is what understanding does. And you actually feel it in your brain when it happens. Your 'cognitive load' decreases, your level of stress and anxiety decrease, and your emotional state improves." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"What defines a good leader? Enabling other people to step into the unseen. […] as the world becomes increasingly connected and thus unpredictable, the concept of leadership too must change. Rather than lead from the front toward efficiency, offering the answers, a good leader is defined by how he or she leads others into darkness - into uncertainty." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

05 April 2006

🖍️Robert Hooke - Collected Quotes

"Accounting figures are a blend of facts and arbitrary procedures that are designed to facilitate the recording and communication of business transactions. Their usefulness in the decision process is sometimes grossly overestimated." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"All of us learn by experience. Except for pure deductive processes, everything we learn is from someone's experience. All experience is a sample from an immense range of possible experience that no one individual can ever take in. It behooves us to know what parts of the information we get from samples can be trusted and what cannot." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Being experimental, however, doesn't necessarily make a scientific study entirely credible. One weakness of experimental work is that it can be out of touch with reality when its controls are so rigid that conclusions are valid only in the experimental situation and don't carryover into the real world." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Correlation analysis is a useful tool for uncovering a tenuous relationship, but it doesn't necessarily provide any real understanding of the relationship, and it certainly doesn't provide any evidence that the relationship is one of cause and effect. People who don't understand correlation tend to credit it with being a more fundamental approach than it is." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Experiments usually are looking for 'signals' of truth, and the search is always ham pered by 'noise' of one kind or another. In judging someone else's experimental results it's important to find out whether they represent a true signal or whether they are just so much noise." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

 "First and foremost an experiment should have a goal, and the goal should be something worth achieving, especially if the experimenter is working on someone else's (for example, the taxpayers') money. 'Worth achieving' implies more than just beneficial; it also should mean that the experiment is the most beneficial thing we can think of doing. Obviously we can't predict accurately the value of an experiment (this may not even be possible after we see how it turns out), but we should feel obliged to make as intelligent a choice as we can. Such a choice is sometimes labeled a 'value judgment'." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"In general a small-scale test or experiment will not detect a small effect, or small differences among various products." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Mistakes arising from retrospective data analysis led to the idea of experimentation, and experience with experimentation led to the idea of controlled experiments and then to the proper design of experiments for efficiency and credibility. When someone is pushing a conclusion at you, it's a good idea to ask where it came from - was there an experiment, and if so, was it controlled and was it relevant?" (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"One important way of developing our powers of discrimination between good and bad statistical studies is to learn about the differences between backward-looking (retrospective or historical) data and data obtained through carefully planned and controlled (forward-looking) experiments." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Only a 0 correlation is uninteresting, and in practice 0 correlations do not occur. When you stuff a bunch of numbers into the correlation formula, the chance of getting exactly 0, even if no correlation is truly present, is about the same as the chance of a tossed coin ending up on edge instead of heads or tails.(Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Randomization is usually a cheap and harmless way of improving the effectiveness of experimentation with very little extra effort." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Science usually amounts to a lot more than blind trial and error. Good statistics consists of much more than just significance tests; there are more sophisticated tools available for the analysis of results, such as confidence statements, multiple comparisons, and Bayesian analysis, to drop a few names. However, not all scientists are good statisticians, or want to be, and not all people who are called scientists by the media deserve to be so described." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Statistical reasoning is such a fundamental part of experimental science that the study of principles of data analysis has become a vital part of the scientist's education. Furthermore, […] the existence of a lot of data does not necessarily mean that any useful information is there ready to be extracted." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"The idea of statistical significance is valuable because it often keeps us from announcing results that later turn out to be nonresults. A significant result tells us that enough cases were observed to provide reasonable assurance of a real effect. It does not necessarily mean, though, that the effect is big enough to be important." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"Today's scientific investigations are so complicated that even experts in related fields may not understand them well. But there is a logic in the planning of experiments and in the analysis of their results that all intelligent people can grasp, and this logic is a great help in determining when to believe what we hear and read and when to be skeptical. This logic has a great deal to do with statistics, which is why statisticians have a unique interest in the scientific method, and why some knowledge of statistics can so often be brought to bear in distinguishing good arguments from bad ones." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"When a real situation involves chance we have to use probability mathematics to understand it quantitatively. Direct mathematical solutions sometimes exist […] but most real systems are too complicated for direct solutions. In these cases the computer, once taught to generate random numbers, can use simulation to get useful answers to otherwise impossible problems." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

🖍️Mike Loukides - Collected Quotes

"Data is frequently missing or incongruous. If data is missing, do you simply ignore the missing points? That isn’t always possible. If data is incongruous, do you decide that something is wrong with badly behaved data (after all, equipment fails), or that the incongruous data is telling its own story, which may be more interesting? It’s reported that the discovery of ozone layer depletion was delayed because automated data collection tools discarded readings that were too low. In data science, what you have is frequently all you’re going to get. It’s usually impossible to get 'better' data, and you have no alternative but to work with the data at hand." (Mike Loukides, "What Is Data Science?", 2011).

"Data science isn’t just about the existence of data, or making guesses about what that data might mean; it’s about testing hypotheses and making sure that the conclusions you’re drawing from the data are valid." (Mike Loukides, "What Is Data Science?", 2011)

"Data scientists combine entrepreneurship with patience, the willingness to build data products incrementally, the ability to explore, and the ability to iterate over a solution. They are inherently interdisciplinary. They can tackle all aspects of a problem, from initial data collection and data conditioning to drawing conclusions. They can think outside the box to come up with new ways to view the problem, or to work with very broadly defined problems: 'there’s a lot of data, what can you make from it?'" (Mike Loukides, "What Is Data Science?", 2011)

"Discovery is the key to building great data products, as opposed to products that are merely good." (Mike Loukides, "The Evolution of Data Products", 2011)

"New interfaces for data products are all about hiding the data itself, and getting to what the user wants." (Mike Loukides, "The Evolution of Data Products", 2011)

"The thread that ties most of these applications together is that data collected from users provides added value. Whether that data is search terms, voice samples, or product reviews, the users are in a feedback loop in which they contribute to the products they use. That’s the beginning of data science." (Mike Loukides, "What Is Data Science?", 2011)

"Using data effectively requires something different from traditional statistics, where actuaries in business suits perform arcane but fairly well-defined kinds of analysis. What differentiates data science from statistics is that data science is a holistic approach. We’re increasingly finding data in the wild, and data scientists are involved with gathering data, massaging it into a tractable form, making it tell its story, and presenting that story to others" (Mike Loukides, "What Is Data Science?", 2011).

"Whether we’re talking about web server logs, tweet streams, online transaction records, 'citizen science', data from sensors, government data, or some other source, the problem isn’t finding data, it’s figuring out what to do with it." (Mike Loukides, "What Is Data Science?", 2011)

🖍️Charles Wheelan - Collected Quotes

"A statistical index has all the potential pitfalls of any descriptive statistic - plus the distortions introduced by combining multiple indicators into a single number. By definition, any index is going to be sensitive to how it is constructed; it will be affected both by what measures go into the index and by how each of those measures is weighted." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Correlation measures the degree to which two phenomena are related to one another. [...] Two variables are positively correlated if a change in one is associated with a change in the other in the same direction, such as the relationship between height and weight. [...] A correlation is negative if a positive change in one variable is associated with a negative change in the other, such as the relationship between exercise and weight." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Descriptive statistics give us insight into phenomena that we care about. […] Although the field of statistics is rooted in mathematics, and mathematics is exact, the use of statistics to describe complex phenomena is not exact. That leaves plenty of room for shading the truth." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Even if you have a solid indicator of what you are trying to measure and manage, the challenges are not over. The good news is that 'managing by statistics' can change the underlying behavior of the person or institution being managed for the better. If you can measure the proportion of defective products coming off an assembly line, and if those defects are a function of things happening at the plant, then some kind of bonus for workers that is tied to a reduction in defective products would presumably change behavior in the right kinds of ways. Each of us responds to incentives (even if it is just praise or a better parking spot). Statistics measure the outcomes that matter; incentives give us a reason to improve those outcomes." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Even in the best of circumstances, statistical analysis rarely unveils “the truth.” We are usually building a circumstantial case based on imperfect data. As a result, there are numerous reasons that intellectually honest individuals may disagree about statistical results or their implications. At the most basic level, we may disagree on the question that is being answered." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"If the distance from the mean for one variable tends to be broadly consistent with distance from the mean for the other variable (e.g., people who are far from the mean for height in either direction tend also to be far from the mean in the same direction for weight), then we would expect a strong positive correlation. If distance from the mean for one variable tends to correspond to a similar distance from the mean for the second variable in the other direction (e.g., people who are far above the mean in terms of exercise tend to be far below the mean in terms of weight), then we would expect a strong negative correlation. If two variables do not tend to deviate from the mean in any meaningful pattern (e.g., shoe size and exercise) then we would expect little or no correlation." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Once these different measures of performance are consolidated into a single number, that statistic can be used to make comparisons […] The advantage of any index is that it consolidates lots of complex information into a single number. We can then rank things that otherwise defy simple comparison […] Any index is highly sensitive to the descriptive statistics that are cobbled together to build it, and to the weight given to each of those components. As a result, indices range from useful but imperfect tools to complete charades." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Probability is the study of events and outcomes involving an element of uncertainty." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Regression analysis, like all forms of statistical inference, is designed to offer us insights into the world around us. We seek patterns that will hold true for the larger population. However, our results are valid only for a population that is similar to the sample on which the analysis has been done." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Statistics cannot be any smarter than the people who use them. And in some cases, they can make smart people do dumb things." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"The correlation coefficient has two fabulously attractive characteristics. First, for math reasons that have been relegated to the appendix, it is a single number ranging from –1 to 1. A correlation of 1, often described as perfect correlation, means that every change in one variable is associated with an equivalent change in the other variable in the same direction. A correlation of –1, or perfect negative correlation, means that every change in one variable is associated with an equivalent change in the other variable in the opposite direction. The closer the correlation is to 1 or –1, the stronger the association. […] The second attractive feature of the correlation coefficient is that it has no units attached to it. […] The correlation coefficient does a seemingly miraculous thing: It collapses a complex mess of data measured in different units (like our scatter plots of height and weight) into a single, elegant descriptive statistic." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"The problem is that the mechanics of regression analysis are not the hard part; the hard part is determining which variables ought to be considered in the analysis and how that can best be done. Regression analysis is like one of those fancy power tools. It is relatively easy to use, but hard to use well - and potentially dangerous when used improperly." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"There are limits on the data we can gather and the kinds of experiments we can perform."(Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"While the main point of statistics is to present a meaningful picture of things we care about, in many cases we also hope to act on these numbers." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

04 April 2006

🖍️Brian Godsey - Collected Quotes

"A good software developer (or engineer) and a good data scientist have several traits in common. Both are good at designing and building complex systems with many interconnected parts; both are familiar with many different tools and frameworks for building these systems; both are adept at foreseeing potential problems in those systems before they’re actualized. But in general, software developers design systems consisting of many well-defined components, whereas data scientists work with systems wherein at least one of the components isn’t well defined prior to being built, and that component is usually closely involved with data processing or analysis." (Brian Godsey, "Think Like a Data Scientist", 2017)

"A notable difference between many fields and data science is that in data science, if a customer has a wish, even an experienced data scientist may not know whether it’s possible. Whereas a software engineer usually knows what tasks software tools are capable of performing, and a biologist knows more or less what the laboratory can do, a data scientist who has not yet seen or worked with the relevant data is faced with a large amount of uncertainty, principally about what specific data is available and about how much evidence it can provide to answer any given question. Uncertainty is, again, a major factor in the data scientific process and should be kept at the forefront of your mind when talking with customers about their wishes."  (Brian Godsey, "Think Like a Data Scientist", 2017)

"The process of data science begins with preparation. You need to establish what you know, what you have, what you can get, where you are, and where you would like to be. This last one is of utmost importance; a project in data science needs to have a purpose and corresponding goals. Only when you have well-defined goals can you begin to survey the available resources and all the possibilities for moving toward those goals." (Brian Godsey, "Think Like a Data Scientist", 2017)

"Uncertainty is an adversary of coldly logical algorithms, and being aware of how those algorithms might break down in unusual circumstances expedites the process of fixing problems when they occur - and they will occur. A data scientist’s main responsibility is to try to imagine all of the possibilities, address the ones that matter, and reevaluate them all as successes and failures happen." (Brian Godsey, "Think Like a Data Scientist", 2017)

🖍️Ely Devons - Collected Quotes

"Every economic and social situation or problem is now described in statistical terms, and we feel that it is such statistics which give us the real basis of fact for understanding and analysing problems and difficulties, and for suggesting remedies. In the main we use such statistics or figures without any elaborate theoretical analysis; little beyond totals, simple averages and perhaps index numbers. Figures have become the language in which we describe our economy or particular parts of it, and the language in which we argue about policy." (Ely Devons, "Essays in Economics", 1961)

"Indeed the language of statistics is rarely as objective as we imagine. The way statistics are presented, their arrangement in a particular way in tables, the juxtaposition of sets of figures, in itself reflects the judgment of the author about what is significant and what is trivial in the situation which the statistics portray." (Ely Devons, "Essays in Economics", 1961)

"It might be reasonable to expect that the more we know about any set of statistics, the greater the confidence we would have in using them, since we would know in which directions they were defective; and that the less we know about a set of figures, the more timid and hesitant we would be in using them. But, in fact, it is the exact opposite which is normally the case; in this field, as in many others, knowledge leads to caution and hesitation, it is ignorance that gives confidence and boldness. For knowledge about any set of statistics reveals the possibility of error at every stage of the statistical process; the difficulty of getting complete coverage in the returns, the difficulty of framing answers precisely and unequivocally, doubts about the reliability of the answers, arbitrary decisions about classification, the roughness of some of the estimates that are made before publishing the final results. Knowledge of all this, and much else, in detail, about any set of figures makes one hesitant and cautious, perhaps even timid, in using them." (Ely Devons, "Essays in Economics", 1961)

"The art of using the language of figures correctly is not to be over-impressed by the apparent air of accuracy, and yet to be able to take account of error and inaccuracy in such a way as to know when, and when not, to use the figures. This is a matter of skill, judgment, and experience, and there are no rules and short cuts in acquiring this expertness." (Ely Devons, "Essays in Economics", 1961)

"The knowledge that the economist uses in analysing economic problems and in giving advice on them is of thre First, theories of how the economic system works (and why it sometimes does not work so well); second, commonsense maxims about reasonable economic behaviour; and third, knowledge of the facts describing the main features of the economy, many of these facts being statistical." (Ely Devons, "Essays in Economics", 1961)

"The general models, even of the most elaborate kind, serve the simple purpose of demonstrating the interconnectedness of all economic phenomena, and show how, under certain conditions, price may act as a guiding link between them. Looked at in another way such models show how a complex set of interrelations can hang together consistently without any central administrative direction." (Ely Devons, "Essays in Economics", 1961)

"The most important and frequently stressed prescription for avoiding pitfalls in the use of economic statistics, is that one should find out before using any set of published statistics, how they have been collected, analysed and tabulated. This is especially important, as you know, when the statistics arise not from a special statistical enquiry, but are a by-product of law or administration. Only in this way can one be sure of discovering what exactly it is that the figures measure, avoid comparing the non-comparable, take account of changes in definition and coverage, and as a consequence not be misled into mistaken interpretations and analysis of the events which the statistics portray." (Ely Devons, "Essays in Economics", 1961)

 "The two most important characteristics of the language of statistics are first, that it describes things in quantitative terms, and second, that it gives this description an air of accuracy and precision. The limitations, as well as the advantages, of the statistical approach arise from these two characteristics. For a description of the quantitative aspect of events never gives us the whole story; and even the best statistics are never, and never can be, completely accurate and precise. To avoid misuse of the language we must, therefore, guard against exaggerating the importance of the elements in any situation that can be described quantitatively, and we must know sufficient about the error and inaccuracy of the figures to be able to use them with discretion." (Ely Devons, "Essays in Economics", 1961)

"There are, indeed, plenty of ways in which statistics can help in the process of decision-taking. But exaggerated claims for the role they can play merely serve to confuse rather than clarify issues of public policy, and lead those responsible for action to oscillate between over-confidence and over-scepticism in using them." (Ely Devons, "Essays in Economics", 1961)

"There is a demand for every issue of economic policy to be discussed in terms of statistics, and even those who profess a general distrust of statistics are usually more impressed by an argument in support of a particular policy if it is backed up by figures. There is a passionate desire in our society to see issues of economic policy decided on what we think are rational grounds. We rebel against any admission of the uncertainty of our knowledge of the future as a confession of weakness." (Ely Devons, "Essays in Economics", 1961)

"There seems to be striking similarities between the role of economic statistics in our society and some of the functions which magic and divination play in primitive society." (Ely Devons, "Essays in Economics", 1961)

"This exaggerated influence of statistics resulting from willingness, indeed eagerness, to be impressed by the 'hard facts' provided by the 'figures', may play an important role in decision-making." (Ely Devons, "Essays in Economics", 1961)

"We all know that in economic statistics particularly, true precision, comparability and accuracy is extremely difficult to achieve, and it is for this reason that the language of economic statistics is so difficult to handle." (Ely Devons, "Essays in Economics", 1961)

03 April 2006

🖍️Kristin H Jarman - Collected Quotes

"A study is any data collection exercise. The purpose of any study is to answer a question. [...] Once the question has been clearly articulated, it’s time to design a study to answer it. At one end of the spectrum, a study can be a controlled experiment, deliberate and structured, where researchers act like the ultimate control freaks, manipulating everything from the gender of their test subjects to the humidity in the room. Scientific studies, the kind run by men in white lab coats and safety goggles, are often controlled experiments. At the other end of the spectrum, an observational study is simply the process of watching something unfold without trying to impact the outcome in any way." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"According to the central limit theorem, it doesn’t matter what the raw data look like, the sample variance should be proportional to the number of observations and if I have enough of them, the sample mean should be normal." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"Although it’s a little more complicated than [replication and random sampling], blocking is a powerful way to eliminate confounding factors. Blocking is the process of dividing a sample into one or more similar groups, or blocks, so that samples in each block have certain factors in common. This technique is a great way to gain a little control over an experiment with lots of uncontrollable factors." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"Any factor you don’t account for can become a confounding factor. A confounding factor is any variable that confuses the conclusions of your study, or makes them ambiguous. [...] Confounding factors can really screw up an otherwise perfectly good statistical analysis." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"Any time you collect data, you have uncertainty to deal with. This uncertainty comes from two places: (1) inherent variation in the values a random variable can take on and (2) the fact that for most studies, you can’t capture the entire population and so you must rely on a sample to make your conclusions." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"Choosing and organizing a sample is a crucial part of the experimental design process. Statistically speaking, the best type of sample is called a random sample. A random sample is a subset of the entire population, chosen so each member is equally likely to be picked. [...] Random sampling is the best way to guarantee you’ve chosen objectively, without personal preference or bias." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"Probability, the mathematical language of uncertainty, describes what are called random experiments, bets, campaigns, trials, games, brawls, and anything other situation where the outcome isn’t known beforehand. A probability is a fraction, a value between zero and one that measures the likelihood a given outcome will occur. A probability of zero means the outcome is virtually impossible. A probability of one means it will almost certainly happen. A probability of one-half means the outcome is just as likely to occur as not." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"Replication is the process of taking more than one observation or measurement. [...] Replication helps eliminate negative effects of uncontrollable factors, because it keeps us from getting fooled by a single, unusual outcome." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"The random experiment, or trial, is the situation whose outcome is uncertain, the one you’re watching. A coin toss is a random experiment, because you don’t know beforehand whether it will turn up heads or tails. The sample space is the list of all possible separate and distinct outcomes in your random experiment. The sample space in a coin toss contains the two outcomes heads and tails. The outcome you're interested in calculating a probability for is the event. On a coin toss, that might be the case where the coin lands on heads." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"The scientific method is the foundation of modern research. It’s how we prove a theory. It’s how we demonstrate cause and effect. It’s how we discover, innovate, and invent. There are five basic steps to the scientific method: (1) Ask a question. (2) Conduct background research. (3) Come up with a hypothesis. (4) Test the hypothesis with data. (5) Revise and retest the hypothesis until a conclusion can be made." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

"There are three important requirements for the probability distribution. First, it should be defined for every possible value the random variable can take on. In other words, it should completely describe the sample space of a random experiment. Second, the probability distribution values should always be nonnegative. They’re meant to measure probabilities, after all, and probabilities are never less than zero. Finally, when all the probability distribution values are summed together, they must add to one." (Kristin H Jarman, "The Art of Data Analysis: How to answer almost any question using basic statistics", 2013)

OOP: Attribute (Definitions)

"Additional characteristics or information defined for an entity." (Owen Williams, "MCSE TestPrep: SQL Server 6.5 Design and Implementation", 1998)

"A named characteristic or property of a class." (Craig Larman, "Applying UML and Patterns", 2004)

"A characteristic, quality, or property of an entity class. For example, the properties 'First Name' and 'Last Name' are attributes of entity class 'Person'." (Danette McGilvray, "Executing Data Quality Projects", 2008)

"Another name for a field, used by convention in many object-oriented programming languages. Scala follows Java’s convention of preferring the term field over attribute." (Dean Wampler & Alex Payne, "Programming Scala", 2009)

"1. (UML diagram) A descriptor of a kind of information captured about an object class. 2. (Relational theory) The definition of a descriptor of a relation." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

"A fact type element (specifically a characteristic assignment) that is a descriptor of an entity class." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

"A characteristic of an object." (Requirements Engineering Qualifications Board, "Standard glossary of terms used in Requirements Engineering", 2011)

"An inherent characteristic, an accidental quality, an object closely associated with or belonging to a specific person, place, or office; a word ascribing a quality." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

02 April 2006

🖍️Prashant Natarajan - Collected Quotes

"Data quality in warehousing and BI is typically defined in terms of the 4 C’s—is the data clean, correct, consistent, and complete? When it comes to big data, there are two schools of thought that have different views and expectations of data quality. The first school believes that the gold standard of the 4 C’s must apply to all data (big and little) used for clinical care and performance metrics. The second school believes that in big data environments, a stringent data quality standard is impossible, too costly, or not required. While diametrically opposite opinions may play well in panel discussions, they do little to reconcile the realities of healthcare data quality." (Prashant Natarajan et al, "Demystifying Big Data and Machine Learning for Healthcare", 2017) 

"Data warehousing has always been difficult, because leaders within an organization want to approach warehousing and analytics as just another technology or application buy. Viewed in this light, they fail to understand the complexity and interdependent nature of building an enterprise reporting environment." (Prashant Natarajan et al, "Demystifying Big Data and Machine Learning for Healthcare", 2017)

"Generalization is a core concept in machine learning; to be useful, machine-learning algorithms can’t just memorize the past, they must learn from the past. Generalization is the ability to respond properly to new situations based on experience from past situations." (Prashant Natarajan et al, "Demystifying Big Data and Machine Learning for Healthcare", 2017)

"The field of big-data analytics is still littered with a few myths and evidence-free lore. The reasons for these myths are simple: the emerging nature of technologies, the lack of common definitions, and the non-availability of validated best practices. Whatever the reasons, these myths must be debunked, as allowing them to persist usually has a negative impact on success factors and Return on Investment (RoI). On a positive note, debunking the myths allows us to set the right expectations, allocate appropriate resources, redefine business processes, and achieve individual/organizational buy-in." (Prashant Natarajan et al, "Demystifying Big Data and Machine Learning for Healthcare", 2017) 

"The first myth is that prediction is always based on time-series extrapolation into the future (also known as forecasting). This is not the case: predictive analytics can be applied to generate any type of unknown data, including past and present. In addition, prediction can be applied to non-temporal (time-based) use cases such as disease progression modeling, human relationship modeling, and sentiment analysis for medication adherence, etc. The second myth is that predictive analytics is a guarantor of what will happen in the future. This also is not the case: predictive analytics, due to the nature of the insights they create, are probabilistic and not deterministic. As a result, predictive analytics will not be able to ensure certainty of outcomes." (Prashant Natarajan et al, "Demystifying Big Data and Machine Learning for Healthcare", 2017)

"Your machine-learning algorithm should answer a very specific question that tells you something you need to know and that can be answered appropriately by the data you have access to. The best first question is something you already know the answer to, so that you have a reference and some intuition to compare your results with. Remember: you are solving a business problem, not a math problem."(Prashant Natarajan et al, "Demystifying Big Data and Machine Learning for Healthcare", 2017)

🖍️Andrew Ng - Collected Quotes

"AI is not a panacea. It cannot solve all problems. And like every technological disruption before it (the steam engine, internal combustion, electricity), it will bring about disruption good and bad." (Andrew Ng, [blog post] 2018)

"Carrying out error analysis on a learning algorithm is like using data science to analyze an ML system’s mistakes in order to derive insights about what to do next. At its most basic, error analysis by parts tells us what component(s) performance is (are) worth the greatest effort to improve." (Andrew Ng, "Machine Learning Yearning", 2018)

"In practice, increasing the size of your model will eventually cause you to run into computational problems because training very large models is slow. You might also exhaust your ability to acquire more training data. [...] Increasing the model size generally reduces bias, but it might also increase variance and the risk of overfitting. However, this overfitting problem usually arises only when you are not using regularization. If you include a well-designed regularization method, then you can usually safely increase the size of the model without increasing overfitting." (Andrew Ng, "Machine Learning Yearning", 2018)

"It is very difficult to know in advance what approach will work best for a new problem. Even experienced machine learning researchers will usually try out many dozens of ideas before they discover something satisfactory." (Andrew Ng, "Machine Learning Yearning", 2018)

"Keep in mind that artificial data synthesis has its challenges: it is sometimes easier to create synthetic data that appears realistic to a person than it is to create data that appears realistic to a computer." (Andrew Ng, "Machine Learning Yearning", 2018)

"AI is the new electricity: even with its current limitations, it is already transforming multiple industries. (Andrew Ng, [blog post] 2018)

"Artificial Intelligence can't solve all the world's problems, but it can help us with some of the biggest ones." (Andrew Ng) [attributed]

"Artificial Intelligence is a tool to help us be better humans, to help us get through the world more easily and richer and be more productive and engaging." (Andrew Ng) [attributed]

"If you can collect really large datasets, the algorithms often don't matter." (Andrew Ng) [attributed]

"Missing data is an opportunity, not a limitation." (Andrew Ng) [attributed]

"No one knows what the right algorithm is, but it gives us hope that if we can discover some crude approximation of whatever this algorithm is and implement it on a computer, that can help us make a lot of progress." (Andrew Ng) [attributed]

"Real-world problems are messy, and they rarely fit exactly into one category or another." (Andrew Ng) [attributed]

"The ability to innovate and to be creative are teachable processes. There are ways by which people can systematically innovate or systematically become creative." (Andrew Ng) [attributed]

"The key to AI success is not just having the right algorithms, but also having the right data to train those algorithms." (Andrew Ng) [attributed]

"The more data we can feed into the algorithms, the better models we can build." (Andrew Ng) [attributed]

🖍️John D Barrow - Collected Quotes

"Each of the most basic physical laws that we know corresponds to some invariance, which in turn is equivalent to a collection of changes which form a symmetry group. […] whilst leaving some underlying theme unchanged. […] for example, the conservation of energy is equivalent to the invariance of the laws of motion with respect to translations backwards or forwards in time […] the conservation of linear momentum is equivalent to the invariance of the laws of motion with respect to the position of your laboratory in space, and the conservation of angular momentum to an invariance with respect to directional orientation [...] discovery of conservation laws indicated that Nature possessed built-in sustaining principles which prevented the world from just ceasing to be."  (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Everywhere […] in the Universe, we discern that closed physical systems evolve in the same sense from ordered states towards a state of complete disorder called thermal equilibrium. This cannot be a consequence of known laws of change, since […] these laws are time symmetric- they permit […] time-reverse. […] The initial conditions play a decisive role in endowing the world with its sense of temporal direction. […] some prescription for initial conditions is crucial if we are to understand […]" (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"In practice, the intelligibility of the world amounts to the fact that we find it to be algorithmically compressible. We can replace sequences of facts and observational data by abbreviated statements which contain the same information content. These abbreviations we often call 'laws of Nature.' If the world were not algorithmically compressible, then there would exist no simple laws of nature. Instead of using the law of gravitation to compute the orbits of the planets at whatever time in history we want to know them, we would have to keep precise records of the positions of the planets at all past times; yet this would still not help us one iota in predicting where they would be at any time in the future. This world is potentially and actually intelligible because at some level it is extensively algorithmically compressible. At root, this is why mathematics can work as a description of the physical world. It is the most expedient language that we have found in which to express those algorithmic compressions."  (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"On this view, we recognize science to be the search for algorithmic compressions. We list sequences of observed data. We try to formulate algorithms that compactly represent the information content of those sequences. Then we test the correctness of our hypothetical abbreviations by using them to predict the next terms in the string. These predictions can then be compared with the future direction of the data sequence. Without the development of algorithmic compressions of data all science would be replaced by mindless stamp collecting - the indiscriminate accumulation of every available fact. Science is predicated upon the belief that the Universe is algorithmically compressible and the modern search for a Theory of Everything is the ultimate expression of that belief, a belief that there is an abbreviated representation of the logic behind the Universe's properties that can be written down in finite form by human beings."  (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"The goal of science is to make sense of the diversity of Nature."  (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"There is one qualitative aspect of reality that sticks out from all others in both profundity and mystery. It is the consistent success of mathematics as a description of the workings of reality and the ability of the human mind to discover and invent mathematical truths."  (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Highly correlated brown and black noise patterns do not seem to have seem to have attractive counterparts in the visual arts. There, over-correlation is the order of the day, because it creates the same dramatic associations that we find in attractive natural scenery, or in the juxtaposition of symbols. Somehow, it is tediously predictable when cast in a one-dimensional medium, like sound." (John D Barrow, "The Artful Universe", 1995)

"Where there is life there is a pattern, and where there is a pattern there is mathematics." (John D Barrow, "The Artful Universe", 1995)

"The advent of small, inexpensive computers with superb graphics has changed the way many sciences are practiced, and the way that all sciences present the results of experiments and calculations." (John D Barrow, "Cosmic Imagery: Key Images in the History of Science", 2008)

🖍️Herbert F Spirer - Collected Quotes

"Clearly, the mean is greatly influenced by extreme values, but it can be appropriate for many situations where extreme values do not arise. To avoid misuse, it is essential to know which summary measure best reflects the data and to use it carefully. Understanding the situation is necessary for making the right choice. Know the subject!" (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"'Garbage in, garbage out' is a sound warning for those in the computer field; it is every bit as sound in the use of statistics. Even if the 'garbage' which comes out leads to a correct conclusion, this conclusion is still tainted, as it cannot be supported by logical reasoning. Therefore, it is a misuse of statistics. But obtaining a correct conclusion from faulty data is the exception, not the rule. Bad basic data (the 'garbage in') almost always leads to incorrect conclusions (the 'garbage out'). Unfortunately, incorrect conclusions can lead to bad policy or harmful actions." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"Graphic misrepresentation is a frequent misuse in presentations to the nonprofessional. The granddaddy of all graphical offenses is to omit the zero on the vertical axis. As a consequence, the chart is often interpreted as if its bottom axis were zero, even though it may be far removed. This can lead to attention-getting headlines about 'a soar' or 'a dramatic rise (or fall)'. A modest, and possibly insignificant, change is amplified into a disastrous or inspirational trend." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"If you want to show the growth of numbers which tend to grow by percentages, plot them on a logarithmic vertical scale. When plotted against a logarithmic vertical axis, equal percentage changes take up equal distances on the vertical axis. Thus, a constant annual percentage rate of change will plot as a straight line. The vertical scale on a logarithmic chart does not start at zero, as it shows the ratio of values (in this case, land values), and dividing by zero is impossible." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"In analyzing data, more is not necessarily better. Unfortunately, it is not always possible to have one uniquely correct procedure for analyzing a given data set. An investigator may use several different methods of statistical analysis on a data set. Furthermore, different outcomes may result from the use of different analytical methods. If more than one conclusion results, then an investigator is committing a misuse of statistics unless the investigator shows and reconciles all the results. If the investigator shows only one conclusion or interpretation, ignoring the alternative procedure(s), the work is a misuse of statistics." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"It is a consequence of the definition of the arithmetic mean that the mean will lie somewhere between the lowest and highest values. In the unrealistic and meaningless case that all values which make up the mean are the same, all values will be equal to the average. In an unlikely and impractical case, it is possible for only one of many values to be above or below the average. By the very definition of the average, it is impossible for all values to be above average in any case." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"It is a major statistical sin to show a graph displaying a variable as a function of time with the vertical (left-hand) scale cut short so that it does not go down to zero, without drawing attention to this fact. This sin can create a seriously misleading impression, and, as they do with most sins, sinners commit it again and again." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"It is a misuse of statistics to use whichever set of statistics suits the purpose at hand and ignore the conflicting sets and the implications of the conflicts." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"Jargon and complex methodology have their place. But true professional jargon is merely a shorthand way of speaking. Distrust any jargon that cannot be translated into plain English. Sophisticated methods can bring unique insights, but they can also be used to cover inadequate data and thinking. Good analysts can explain their methods in simple, direct terms. Distrust anyone who can't make clear how they have treated the data." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"Know the subject matter, learn it fast, or get a trustworthy expert. To identify the unknown, you must know the known. But don't be afraid to challenge experts on the basis of your logical reasoning. Sometimes a knowledge of the subject matter can blind the expert to the novel or unexpected." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"Percentages seem to invite misuse, perhaps because they require such careful thinking." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"There is no shortage of statistical methods. Elementary statistics textbooks list dozens, and statisticians constantly develop and report new ones. But if a researcher uses the wrong method, a clear misuse, to analyze a specific set of data, then the results may be incorrect." (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

"When an analyst selects the wrong tool, this is a misuse which usually leads to invalid conclusions. Incorrect use of even a tool as simple as the mean can lead to serious misuses. […] But all statisticians know that more complex tools do not guarantee an analysis free of misuses. Vigilance is required on every statistical level."  (Herbert F Spirer et al, "Misused Statistics" 2nd Ed, 1998)

01 April 2006

🖍️Alfred R Ilersic - Collected Quotes

"Diagrams are sometimes used, not merely to convey several pieces of information such as several time series on one chart, but also to provide visual evidence of relationships between the series." (Alfred R Ilersic, "Statistics", 1959)

"Everybody has some idea of the meaning of the term 'probability' but there is no agreement among scientists on a precise definition of the term for the purpose of scientific methodology. It is sufficient for our purpose, however, if the concept is interpreted in terms of relative frequency, or more simply, how many times a particular event is likely to occur in a large population." (Alfred R Ilersic, "Statistics", 1959)

"However informative and well designed a statistical table may be, as a medium for conveying to the reader an immediate and clear impression of its content, it is inferior to a good chart or graph. Many people are incapable of comprehending large masses of information presented in tabular form; the figures merely confuse them. Furthermore, many such people are unwilling to make the effort to grasp the meaning of such data. Graphs and charts come into their own as a means of conveying information in easily comprehensible form." (Alfred R Ilersic, "Statistics", 1959)

"In brief, the greatest care must be exercised in using any statistical data, especially when it has been collected by another agency. At all times, the statistician who uses published data must ask himself, by whom were the data collected, how and for what purpose?" (Alfred R Ilersic, "Statistics", 1959)

"It is a good rule to remember that the first step in analyzing any statistical data, whether it be culled from an official publication or a report prepared by someone else, is to check the definitions used for classification." (Alfred R Ilersic, "Statistics", 1959)

"It is helpful to remember when dealing with index numbers that they are specialized tools and as such are most efficient and useful when properly used. A screwdriver is a poor substitute for a chisel, although it may be used as such. All index numbers are designed to measure particular groups of related changes." (Alfred R Ilersic, "Statistics", 1959)

"Most people tend to think of values and quantities expressed in numerical terms as being exact figures; much the same as the figures which appear in the trading account of a company. It therefore comes as a considerable surprise to many to learn that few published statistics, particularly economic and sociological data, are exact. Many published figures are only approximations to the real value, while others are estimates of aggregates which are far too large to be measured with precision." (Alfred R Ilersic, "Statistics", 1959)

"Numerical data, which have been recorded at intervals of time, form what is generally described as a time series. [...] The purpose of analyzing time series is not always the determination of the trend by itself. Interest may be centered on the seasonal movement displayed by the series and, in such a case, the determination of the trend is merely a stage in the process of measuring and analyzing the seasonal variation. If a regular basic or under- lying seasonal movement can be clearly established, forecasting of future movements becomes rather less a matter of guesswork and more a matter of intelligent forecasting." (Alfred R Ilersic, "Statistics", 1959)

"Often, in order to simplify statistical tables, the practice of rounding large figures and totals is resorted to. Where the constituent figures in a table together with their aggregate have been so treated, a discrepancy between the rounded total and the true sum of the rounded constituent figures frequently arises. Under no circumstances should the total be adjusted to what appears to be the right answer. A note to the table to the effect that the figures have been rounded, e.g. to the nearest 1,000, is all that is necessary. The same remark applies to percentage equivalents of the constituent parts of a total; if they do not add to exactly 100 per cent, leave them." (Alfred R Ilersic, "Statistics", 1959)
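
The discrepancy Ilersic describes is easy to reproduce; the figures below are hypothetical:

```python
# Hypothetical constituent figures, each rounded to the nearest 1,000.
parts = [1_400, 1_400, 1_200]                      # true values
rounded_parts = [round(p, -3) for p in parts]      # [1000, 1000, 1000]

true_total = sum(parts)                            # 4000
rounded_total = round(true_total, -3)              # 4000

# The rounded constituents sum to 3,000, yet the correctly rounded
# total is 4,000 - the total should NOT be "adjusted" to match.
print(sum(rounded_parts), rounded_total)
```

The right practice, per the quote, is to publish both as they stand and add a note that figures are rounded to the nearest 1,000.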

"Poor statistics may be attributed to a number of causes. There are the mistakes which arise in the course of collecting the data, and there are those which occur when those data are being converted into manageable form for publication. Still later, mistakes arise because the conclusions drawn from the published data are wrong. The real trouble with errors which arise during the course of collecting the data is that they are the hardest to detect." (Alfred R Ilersic, "Statistics", 1959)

"Statistical method consists of two main operations; counting and analysis. [...] The statistician has no use for information that cannot be expressed numerically, nor generally speaking, is he interested in isolated events or examples. The term 'data' is itself plural and the statistician is concerned with the analysis of aggregates." (Alfred R Ilersic, "Statistics", 1959)

"The averaging of percentages themselves requires care, where the percentages are each computed on different bases, i.e. different quantities. The average is not derived by aggregating the percentages and dividing them. Instead of this, each percentage must first be multiplied by its base to bring out its relative significance to the other percentages and to the total. The sum of the resultant products is then divided by the sum of the base values [...], not merely the number of items." (Alfred R Ilersic, "Statistics", 1959)
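The procedure in the quote is a weighted average; a minimal sketch with hypothetical groups:

```python
# Hypothetical groups: each percentage was computed on a different base.
percentages = [10.0, 50.0]   # per cent in each group
bases = [900, 100]           # the quantity each percentage refers to

# Wrong: aggregating the percentages and dividing by their number.
naive = sum(percentages) / len(percentages)                      # 30.0

# Right: multiply each percentage by its base, then divide the sum
# of the products by the sum of the base values.
weighted = sum(p * b for p, b in zip(percentages, bases)) / sum(bases)

print(naive, weighted)   # 30.0 vs 14.0
```

The check is direct: 10% of 900 is 90, 50% of 100 is 50, and 140 out of 1,000 is 14 per cent, not 30.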

"The rounding of individual values comprising an aggregate can give rise to what are known as unbiased or biased errors. [...] The biased error arises because all the individual figures are reduced to the lower 1,000 [...] The unbiased error is so described since by rounding each item to the nearest 1,000 some of the approximations are greater and some smaller than the original figures. Given a large number of such approximations, the final total may therefore correspond very closely to the true or original total, since the approximations tend to offset each other. [...] With biased approximations, however, the errors are cumulative and their aggregate increases with the number of items in the series." (Alfred R Ilersic, "Statistics", 1959)
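
A small simulation, on made-up figures, shows the two kinds of error:

```python
import random

random.seed(42)
figures = [random.randint(1, 99_999) for _ in range(10_000)]  # hypothetical data
true_total = sum(figures)

# Biased: every figure reduced to the lower 1,000 - errors accumulate.
biased_total = sum((x // 1000) * 1000 for x in figures)

# Unbiased: each figure rounded to the *nearest* 1,000 - errors
# fall on both sides and tend to offset each other.
unbiased_total = sum(round(x / 1000) * 1000 for x in figures)

print(true_total - biased_total)    # large and always positive
print(true_total - unbiased_total)  # small relative to the total
```

With 10,000 items the biased aggregate falls short by roughly 500 per item on average, while the unbiased total stays close to the true one.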

"The simplest way of indicating that figures are not given precisely to the last unit is to express them to the nearest 100 or 1,000; or in some cases to the nearest 100,000 or million. [...] The widespread desire for precision is reflected in many reports on economic trends which quote figures in great detail, rather than emphasizing the trends and movements reflected in the figures." (Alfred R Ilersic, "Statistics", 1959)

"The statistics themselves prove nothing; nor are they at any time a substitute for logical thinking. There are […] many simple but not always obvious snags in the data to contend with. Variations in even the simplest of figures may conceal a compound of influences which have to be taken into account before any conclusions are drawn from the data." (Alfred R Ilersic, "Statistics", 1959)

"There are good statistics and bad statistics; it may be doubted if there are many perfect data which are of any practical value. It is the statistician's function to discriminate between good and bad data; to decide when an informed estimate is justified and when it is not; to extract the maximum reliable information from limited and possibly biased data." (Alfred R Ilersic, "Statistics", 1959)

"This is the essential characteristic of a logarithmic scale. Any given increase, regardless of its absolute size, is related to a given base quantity. Thus, a perfectly straight line on such a graph denotes a constant percentage rate of increase, and not a constant absolute increase. It is the slope of the line or curve which is significant in such a graph. The steeper the slope, whether it be downwards or upwards, the more marked is the rate of change." (Alfred R Ilersic, "Statistics", 1959)
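
The defining property of the logarithmic scale can be verified numerically with a hypothetical series:

```python
import math

# A series growing at a constant 5% per period (hypothetical values).
series = [100 * 1.05 ** t for t in range(10)]

# On a logarithmic scale the plotted heights are the logs of the values;
# constant percentage growth therefore gives equal vertical steps,
# i.e. a perfectly straight line.
logs = [math.log10(v) for v in series]
steps = [b - a for a, b in zip(logs, logs[1:])]

print(steps)  # every step equals log10(1.05) - the slope of the line
```

A steeper constant slope would simply mean a higher constant percentage rate of change, as the quote notes.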

"This type of graph possesses a number of advantages. It is possible to graph a number of series of widely differing magnitudes on a single chart and bring out any relationship between their movements. However wide the amplitude of the fluctuations in the series, a logarithmic scale reduces them to manageable size on a single sheet of graph paper, whereas, on a normal scale, it might prove impossible to get the larger fluctuations on to a single chart, except by so reducing the scale that all the other smaller movements in the series are almost obliterated." (Alfred R Ilersic, "Statistics", 1959)

"Time series analysis often requires more knowledge of the data and relevant information about their background than it does of statistical techniques. Whereas the data in some other fields may be controlled so as to increase their representativeness, economic data are so changeable in their nature that it is usually impossible to sort out the separate effects of the various influences. Attempts to isolate cyclical, seasonal and irregular, or random movements, are made primarily in the hope that some underlying pattern of change over time may be revealed."  (Alfred R Ilersic, "Statistics", 1959)

"When using estimated figures, i.e. figures subject to error, for further calculation make allowance for the absolute and relative errors. Above all, avoid what is known to statisticians as 'spurious' accuracy. For example, if the arithmetic Mean has to be derived from a distribution of ages given to the nearest year, do not give the answer to several places of decimals. Such an answer would imply a degree of accuracy in the results of your calculations which are quite unjustified by the data. The same holds true when calculating percentages." (Alfred R Ilersic, "Statistics", 1959)

"While it is true to assert that much statistical work involves arithmetic and mathematics, it would be quite untrue to suggest that the main source of errors in statistics and their use is due to inaccurate calculations." (Alfred R Ilersic, "Statistics", 1959)

Book available on Archive.org.

🖍️Charles Livingston - Collected Quotes

"Cautions about combining groups: apples and oranges. In computing an average, be careful about combining groups in which the average for each group is of more interest than the overall average. […] Avoid combining distinct quantities in a single average." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Central tendency is the formal expression for the notion of where data is centered, best understood by most readers as 'average'. There is no one way of measuring where data are centered, and different measures provide different insights." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Concluding that the population is becoming more centralized by observing behavior at the extremes is called the 'Regression to the Mean' Fallacy. […] When looking for a change in a population, do not look only at the extremes; there you will always find a motion to the mean. Look at the entire population." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Data often arrive in raw form, as long lists of numbers. In this case your job is to summarize the data in a way that captures its essence and conveys its meaning. This can be done numerically, with measures such as the average and standard deviation, or graphically. At other times you find data already in summarized form; in this case you must understand what the summary is telling, and what it is not telling, and then interpret the information for your readers or viewers." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"If a hypothesis test points to rejection of the alternative hypothesis, it might not indicate that the null hypothesis is correct or that the alternative hypothesis is false." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Limit a sentence to no more than three numerical values. If you've got more important quantities to report, break those up into other sentences. More importantly, however, make sure that each number is an important piece of information. Which are the important numbers that truly advance the story?" (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Numbers are often useful in stories because they record a recent change in some amount, or because they are being compared with other numbers. Percentages, ratios and proportions are often better than raw numbers in establishing a context." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Probability is sometimes called the language of statistics. […] The probability of an event occurring might be described as the likelihood of it happening. […] In a formal sense the word "probability" is used only when an event or experiment is repeatable and the long term likelihood of a certain outcome can be determined." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"Roughly stated, the standard deviation gives the average of the differences between the numbers on the list and the mean of that list. If data are very spread out, the standard deviation will be large. If the data are concentrated near the mean, the standard deviation will be small." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)
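
Livingston and Voakes's "roughly stated" is deliberate: the standard deviation is precisely the root of the mean *squared* deviation. A minimal sketch with made-up lists:

```python
import math

def std_dev(xs):
    """Population standard deviation: root of the mean squared deviation."""
    mean = sum(xs) / len(xs)
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))

concentrated = [9, 10, 10, 11]   # clustered near the mean of 10
spread_out = [1, 5, 15, 19]      # same mean of 10, far more dispersed

print(std_dev(concentrated), std_dev(spread_out))  # small vs. large
```

Both lists have the same mean, so only the dispersion measure tells them apart.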

"The basic idea of going from an estimate to an inference is simple. Drawing the conclusion with confidence, and measuring the level of confidence, is where the hard work of professional statistics comes in." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"The central limit theorem […] states that regardless of the shape of the curve of the original population, if you repeatedly randomly sample a large segment of your group of interest and take the average result, the set of averages will follow a normal curve." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)
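
The theorem is easy to see empirically. Below, a heavily skewed (exponential) population stands in for the "original population" of the quote:

```python
import random

random.seed(0)

# A skewed, decidedly non-normal population: exponential draws with mean 1.0.
def draw():
    return random.expovariate(1.0)

# Repeatedly take samples of 50 and record each sample's average;
# the set of averages clusters normally around the population mean.
sample_means = [sum(draw() for _ in range(50)) / 50 for _ in range(2000)]

grand_mean = sum(sample_means) / len(sample_means)
print(grand_mean)   # close to 1.0, the population mean
```

Plotting `sample_means` as a histogram would show the familiar bell shape, despite the lopsided population the samples came from.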

"The dual meaning of the word significant brings into focus the distinction between drawing a mathematical inference and practical inference from statistical results." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"The percentage is one of the best (mathematical) friends a journalist can have, because it quickly puts numbers into context. And it's a context that the vast majority of readers and viewers can comprehend immediately." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)
