30 January 2007

🌁Software Engineering: Application Programming Interface (Definitions)

"A set of routines available in an application for use by software programmers when designing an application interface. An excellent example is ADO. APIs make it simple for programmers to add powerful integration capabilities within an application. APIs shield programmers from many of the complexities of coding." (Anthony Sequeira & Brian Alderman, "The SQL Server 2000 Book", 2003)

"A set of routines that an application uses to request and carry out lower-level services." (Evan Levy & Jill Dyché, "Customer Data Integration", 2006)

"Represents a set of functions and methods that other developers can use to access the functionality within another application." (Sara Morganand & Tobias Thernstrom , "MCITP Self-Paced Training Kit: Designing and Optimizing Data Access by Using Microsoft SQL Server 2005 - Exam 70-442", 2007)

"This is a well-defined interface provided by an application or service to support requests and communications from other applications." (Michael Coles, "Pro T-SQL 2008 Programmer's Guide", 2008)

"A set of public programmatic interfaces that consists of a language and a message format to communicate with an operating system or other programmatic environment, such as databases, Web servers, and so forth. These messages typically call functions and methods available for application development." (David Lyle & John G Schmidt, "Lean Integration", 2010)

"An interface implemented by a software program to enable interaction with other software, in much the same way that a user interface facilitates interaction between humans and computers. APIs are implemented by applications, libraries, and operating systems to determine the vocabulary and calling conventions the programmer should employ to use their services. It may include specifications for routines, data structures, object classes, and protocols used to communicate between the consumer and implementer of the API." (Mark S Merkow & Lakshmikanth Raghavan, "Secure and Resilient Software Development", 2010)

"A toll that allows programs to talk to or interact with one another." (Linda Volonino & Efraim Turban, "Information Technology for Management 8th Ed", 2011)

"A published standard format for communicating with application programs." (Craig S Mullins, "Database Administration: The Complete Guide to DBA Practices and Procedures" 2nd Ed, 2012)

"A set of routines that an application uses to request and carry out lower-level services performed by a computer's operating system. These routines usually carry out maintenance tasks such as managing files and displaying information." (Microsoft, "SQL Server 2012 Glossary", 2012)

"This is a well-defined interface provided by an application or service to support requests and communications from other applications." (Jay Natarajan et al, "Pro T-SQL 2012 Programmer's Guide" 3rd Ed, 2012)

"A way of standardizing the connection between two software applications. It is essentially a standard hook that an application uses to connect to another software application." (Robert F Smallwood, "Information Governance: Concepts, Strategies, and Best Practices", 2014)

"A well-defined interface provided by an application or service to support requests and communications from other applications." (Miguel Cebollero et al, "Pro T-SQL Programmer’s Guide" 4th Ed, 2015)

"A set of definitions, protocols, and tools used to build software. Especially important to define when different software parts need to work together." (Pamela Schure & Brian Lawley, "Product Management For Dummies", 2017)

"A set of definitions, protocols, and tools for building software applications, and for allowing software components from different sources to communicate with each other." (Jonathan Ferrar et al, "The Power of People", 2017)

"An interface that allows an application program that is written in a high-level language to use specific data or functions of the operating system or another program." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)

🌁Software Engineering: Object-Oriented Design (Definitions)

"The process of designing a computer application that utilizes OOP concepts in the design to show active objects that are to be developed." (Greg Perry, "Sams Teach Yourself Beginning Programming in 24 Hours" 2nd Ed., 2001)

"The specification of a logical software solution in terms of software objects, such as their classes, attributes, methods, and collaborations." (Craig Larman, "Applying UML and Patterns", 2004)

"The craft of partitioning the system into objects, organizing the objects into class hierarchies, and devising messages that communicate between the objects. See the Bibliography for references on this subject." (James Robertson et al, "Complete Systems Analysis: The Workbook, the Textbook, the Answers", 2013)

"A modular approach to system design in which functions are logically grouped together along with their data structures into objects. These objects generally correspond to logical real-world entities and interact with other objects through well-defined interfaces and hide their internal data structures to protect them from error by objects that have no need to know the internal workings of the object." (O Sami Saydjari, "Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time", 2018)

"A software engineering approach that models a system as a group of interacting objects. Each object represents some entity of interest in the system being modeled, and is characterized by its class, its state (data elements), and its behavior. OOAD encompasses Object-oriented analysis (OOA) and Object-oriented design (OOD)." (IQBBA) 

27 January 2007

🌁Software Engineering: Conceptual Models (Definitions)

"A conceptual model is simply a framework or schematic to understand the interaction of workforce education and development systems with other variables in a society." (Jay W Rojewski, "International Perspectives on Workforce Education and Development", 2004)

"The conceptual model is a non-software specific description of the simulation model that is to be developed, describing the objectives, inputs, outputs, content, assumptions and simplifications of the model." (Stewart Robinson, "Simulation: The Practice of Model Development and Use", 2004) 

"A conceptual model is a mental image of a system, its components, its interactions. It lays the foundation for more elaborate models, such as physical or numerical models. A conceptual model provides a framework in which to think about the workings of a system or about problem solving in general. An ensuing operational model can be no better than its underlying conceptualization." (Henry N Pollack,"Uncertain Science … Uncertain World", 2005)

"A conceptual model is a qualitative description of 'some aspect of the behaviour of a natural system'. This description is usually verbal, but may also be accompanied by figures and graphs." (Howard S. Wheater et al., "Groundwater Modelling in Arid and Semi-Arid Areas, 2010) 

"[…] a conceptual model is a diagram connecting variables and constructs based on theory and logic that displays the hypotheses to be tested." (Mary Wolfinbarger Celsi et al, "Essentials of Business Research Methods", 2011)

"A conceptual model of an interactive application is, in summary: the structure of the application - the objects and their operations, attributes, and relation-ships; an idealized view of the how the application works – the model designers hope users will internalize; the mechanism by which users accomplish the tasks the application is intended to support." (Jeff Johnson & Austin Henderson, "Conceptual Models", 2011)

"Simply put, a conceptual model is a simplified representation of reality, devised for a certain purpose and seen from a certain point of view."(David W Emble & Bernhard Thalheim, "Handbook of Conceptual Modeling", 2012)

"A conceptual model is a framework that is initially used in research to outline the possible courses of action or to present an idea or thought. When a conceptual model is developed in a logical manner, it will provide a rigor to the research process." (N Elangovan & R Rajendran, "Conceptual Model: A Framework for Institutionalizing the Vigor in Business Research", 2015)

"Briefly, a conceptual model is the configuration of conceptual elements and the navigation between them. As such, a conceptual model is the foundation of the user interface of any interactive system." (Avi Parush, "Conceptual Design for Interactive Systems", 2015)

"A model or conceptual model is a schematic or representation that describes how something works." (James Padolsey, "Clean Code in JavaScript", 2020)

20 January 2007

🌁Software Engineering: Design Pattern (Definitions)

"The design patterns [...] are descriptions of communicating objects and classes that are customized to solve a general design problem in a particular context." (Erich Gamma et al, "Design Patterns: Elements of Reusable Object-Oriented Software", 1994)

"A recurring structure or approach to a solution." (Atul Apte, "Java Connector Architecture: Building Custom Connectors and Adapters", 2002)

"A ready-to-use solution for a frequent design problem, expressed as partial design, consisting of classes and associations. Each design pattern is embedded in a set of constraints." (Johannes Link & Peter Fröhlich, "Unit Testing in Java", 2003)

"A generalized solution to a commonly occurring problem. Design patterns are characterized by their name, applicability, scope, structure, behavior, and consequences." (Bruce P Douglass, "Real-Time Agility", 2009)

"A solution to a problem in a context. A code idiom or design structure that satisfies the needs of a frequently occurring problem, constraint, requirement, etc." (Dean Wampler & Alex Payne, "Programming Scala", 2009)

"A solution to a problem in a context. A code idiom or design structure that satisfies the needs of a frequently occurring problem, constraint, requirement, etc. The 'context' portion of the definition is important, as it specifies conditions when the pattern is an appropriate choice and when it isn’t." (Dean Wampler, "Functional Programming for Java Developers", 2011)

"A general term for pattern that includes not only algorithmic strategy patterns but also patterns related to overall code organization." (Michael McCool et al, "Structured Parallel Programming", 2012)

"In object-oriented programming, an arrangement of classes that interact to perform some common and useful task. Similar to an object-oriented algorithm." (Rod Stephens, "Beginning Software Engineering", 2015)

"Design and program templates created to facilitate the construction of robust and reliable programs by capturing knowledge about how to best construct such programs in various domains." (O Sami Saydjari, "Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time", 2018)

"Reusable, well-proven solution to recurring modeling/design problems or scenarios." (Panos Alexopoulos, "Semantic Modeling for Data", 2020)

🌁Software Engineering: Data/Information Hiding (Definitions)

"In programming, the practice of locating some parts of the system in software structures that are invisible (inaccessible) to others. Usually, the information so hidden includes details that the programmer considers inessential and those aspects of the system that result from design decisions that are somehow difficult or likely to change. Compare with abstraction, which is a category of techniques by which one can make decisions about what information to hide." (Bill Pribyl & Steven Feuerstein, "Learning Oracle PL/SQL", 2001)

"Hiding the state of a class in private member variables." (Jesse Liberty, "Sams Teach Yourself C++ in 24 Hours" 3rd Ed., 2001)

"The ability of an object to hide its members from other parts of a program to protect those members from accidental change." (Greg Perry, "Sams Teach Yourself Beginning Programming in 24 Hours" 2nd Ed., 2001)

"The ability of a class to hide the details about how it works from outside code." (Rod Stephens, "Start Here! Fundamentals of Microsoft .NET Programming", 2011)

"The practice of hiding the details of a module with the goal of controlling access to the details of the module" (Nell Dale & John Lewis, "Computer Science Illuminated" 6th Ed., 2015)

"Use of segregation in design decisions to protect software components from negatively interacting with each other. Commonly enforced through strict interfaces." (Adam Gordon, "Official (ISC)2 Guide to the CISSP CBK" 4th Ed., 2015)

"The practice of hiding details within a module with the goal of controlling access to the details from the rest of the system" (Nell Dale et al, "Object-Oriented Data Structures Using Java" 4th Ed., 2016)

"The dependability concept of not allowing functions access to data structures unless they are specifically designated to do so in the context of a module or object." (O Sami Saydjari, "Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time", 2018)

"Concealing the structure of some (potentially unstable) parts of a program behind a stable interface." (Karl Beecher, "Computational Thinking - A beginner's guide to problem-solving and programming", 2017)

"The intentional denial of access to operate directly on data without going through specified encapsulating procedures, which operate on the data in a well-controlled manner. See encapsulation." (O Sami Saydjari, "Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time", 2018)

15 January 2007

🌁Software Engineering: Agile (Definitions)

 "A software development process (life cycle model) that evolves a product in a series of rapid iterations (several weeks or less) using continuous involvement with end user representatives. Several variations exist." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

"A lightweight, minimal-ceremony approach to software and system development emphasizing product quality, meeting customer needs, team collaboration, and responsiveness to change." (Bruce P Douglass, "Real-Time Agility: The Harmony/ESW Method for Real-Time and Embedded Systems Development", 2009)

"Methodologies that describe rapid iterations to get to a target in software development." (Martin Oberhofer et al, "The Art of Enterprise Information Architecture", 2010)

"An umbrella term for several lightweight development processes and specific practices that are designed to minimize process waste, while improving code quality and communications with project stakeholders." (Dean Wampler, "Functional Programming for Java Developers", 2011)

"A framework for developing software or a product that focuses on small iteration cycles (usually two to four weeks), heavy stakeholder interaction, and delivery of functional product at each cycle." (Jason Williamson, Getting a Big Data Job For Dummies, 2015)

13 January 2007

🌁Software Engineering: Architecture Principles (Definitions)

 "Underlying guidelines that hold true across the architecture of multiple systems. These guidelines define the essence of the architecture by capturing the thinking behind it, and they provide a decision framework that enables the process of making decisions on the architecture." (Tilak Mitra et al, "SOA Governance", 2008)

"Architecture principles, policies, and guidelines define the underlying general rules and guidance that an organization will use to deploy business and IT resources and assets across the enterprise." (Allen Dreibelbis et al, "Enterprise Master Data Management", 2008)

"Policies and guidelines that define the underlying rules that an organization uses to deploy business and IT resources across the enterprise." (Martin Oberhofer et al, "The Art of Enterprise Information Architecture", 2010)

"Set of stable rules and recommendations concerning the architecture in its entirety." (Gilbert Raymond & Philippe Desfray, "Modeling Enterprise Architecture with TOGAF", 2014)

11 January 2007

🌁Software Engineering: Agile Development (Definitions)

[agile method:] "A software development process (life cycle model) that evolves a product in a series of rapid iterations (several weeks or less) using continuous involvement with end user representatives. Several variations exist." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

"An evolutionary and highly collaborative approach to development in which the focus is on delivering high-quality, tested software that meets the highest-priority needs of its stakeholders on a regular basis." (Pramod J Sadalage & Scott W Ambler, "Refactoring Databases: Evolutionary Database Design", 2006)

"An IT development philosophy that argues in favor of quick, incremental implementations that focus on small, combined teams of users and developers, and quick turnaround of small slices of functionality. See Bottom-up development." (Evan Levy & Jill Dyché, "Customer Data Integration", 2006)

"An adaptive, iterative method or framework for developing software." (Victor Isakov et al, "MCITP Administrator: Microsoft SQL Server 2005 Optimization and Maintenance (70-444) Study Guide", 2007)

"A philosophy that embraces uncertainty, encourages team communication, values customer satisfaction, vies for early delivery, and promotes sustainable development." (Pankaj Kamthan, "Pair Modeling", 2008)

[agile methods:] "a lightweight, minimal-ceremony approach to software and system development emphasizing product quality, meeting customer needs, team collaboration, and responsiveness to change." (Bruce P Douglass, "Real-Time Agility: The Harmony/ESW Method for Real-Time and Embedded Systems Development", 2009)

"A group of software development methodologies based on iterative development, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. Agile methods generally promote a disciplined project management process that encourages frequent inspection and adaptation, a leadership philosophy that encourages teamwork, self-organization, and accountability, a set of engineering best practices intended to allow for rapid delivery of high-quality software, and a business approach that aligns development with customer needs and company goals." (Mark S Merkow & Lakshmikanth Raghavan, "Secure and Resilient Software Development", 2010)

"A term coined in 2001 with the formulation of the Agile Manifesto to refer to software development methodologies based on iterative development where requirements and solutions evolve through collaboration among self-organizing, cross-functional teams." (Paulraj Ponniah, "Data Warehousing Fundamentals for IT Professionals", 2010)

"A development approach that focuses on building projects incrementally using frequent builds." (Rod Stephens, "Start Here! Fundamentals of Microsoft .NET Programming", 2011)

"A group of software development methodologies based on iterative and incremental development, where requirements and solutions evolve through collaboration between self-organizing, cross-functional teams." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"A group of software development methodologies based on iterative incremental development, where requirements and solutions evolve through collaboration between self-organizing cross-functional teams." (IQBBA, "Standard glossary of terms used in Software Engineering", 2011)

"An umbrella term for several lightweight development processes and specific practices that are designed to minimize process waste, while improving code quality and communications with project stakeholders." (Dean Wampler, "Functional Programming for Java Developers", 2011)

"The creation of working software through rapid iteration, focused on customer collaboration and self-organizing teams." (Jon Radoff, "Game On: Energize Your Business with Social Media Games", 2011)

"A method of software development that stresses quick development cycles; it is seen as an alternative to the 'waterfall' method." (Bill Holtsnider & Brian D Jaffe, "IT Manager's Handbook" 3rd Ed., 2012)

"A software-development methodology that emphasizes iterative and incremental development driven by cross-functional collaboration and co-location." (Evan Stubbs, "Delivering Business Analytics: Practical Guidelines for Best Practice", 2013)

"A development model where you initially provide the fewest possible features at the lowest fidelity to still have a useful application. Over time, you add more features and improve existing features until all features have been implemented at full fidelity." (Rod Stephens, "Beginning Software Engineering", 2015)

"An iterative and incremental approach to, typically, software development that involves short phases of work followed by checking that the work achieves the agreed upon goals and adapting to new information given market changes." (Pamela Schure & Brian Lawley, "Product Management For Dummies", 2017)

"A term used to describe a mindset of values and principles as set forth in the Agile Manifesto." (Project Management Institute, "Practice Standard for Scheduling" 3rd Ed., 2019)

"Agile is a rapid and flexible software development approach includes collaboration of self-organizing, cross-functional team and customer." (Sinemis Zengin, "Customer Centric Innovation in Banking Sector", Handbook of Research on Managerial Thinking in Global Business Economics, 2019)

 "A software development approach to convey rapid changes in the market and customer requirements on high-quality terms." (Fayez Salma & Jorge M Gómez, "Challenges and Trends of Agile", 2021)

"Agile is a development approach that delivers software in increments by following the principles of the Manifesto for Agile Software Development." (Gartner)

07 January 2007

🌁Software Engineering: Design (Definitions)

"(1) The process of defining the architecture, components, interfaces, and other characteristics of a system or component. (2) The result of the process in (1)." (IEEE, "IEEE Standard Glossary of Software Engineering Terminology", 1990)

"A process that uses the products of analysis to produce a specification for implementing a system. A logical description of how a system will work." (Craig Larman, "Applying UML and Patterns", 2004)

"The activity of identifying and defining the architecture, components, interfaces, and attributes of a system or product. See also architectural design. (2) The result of the process in (1)." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

"Analysis discovers what needs to be done. Design figures out how what has been analyzed, can and should be done." (Gavin Powell, "Beginning Database Design", 2006)

"The act of crafting a technological solution to fit the requirements, within the constraints." (Suzanne Robertson & James Robertson, "Mastering the Requirements Process" 2nd Ed, 2006)

"The process of optimizing an analysis model through the selection of design technologies, decisions, and patterns." (Bruce P Douglass, "Real-Time Agility: The Harmony/ESW Method for Real-Time and Embedded Systems Development", 2009)

"1.A deliberate, purposeful plan, layout, delineation, arrangement, and specification of the component parts and interfaces of a product or system. A logical design is an abstract design for fulfilling requirements without consideration for physical constraints. A physical design considers the requirements along with physical constraints. 2.Verb. To conceive, plan, define, arrange, and specify a product or system." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"A plan or outline to accomplish goals and purpose based on evidence and analysis. Degree of detail depends on the situation, such as prototypes and storyboards." (Joan C Dessinger, "Fundamentals of Performance Improvement" 3rd Ed, 2012)

"The process of defining how a system will be implemented; the objective is to use the available technology to implement the essential requirements so that the implemented system looks as much like the essential system, and hence the problem, as possible." (James Robertson et al, "Complete Systems Analysis: The Workbook, the Textbook, the Answers", 2013)

"Design is an iterative process and the goal is to describe the system architecture that will satisfy the functional and non-functional requirements. It involves describing the system at a number of different levels of abstraction, with the designer starting off with an informal picture of the design that is then refined by adding more information." (Gerard O’Regan, "Concise Guide to Software Engineering: From Fundamentals to Application Methods", 2017)

"Activity or process that identifies requirements and then defines a solution that is able to meet these requirements" (ITIL)

03 January 2007

🌁Software Engineering: Algorithm (Definitions)

"(1) A finite set of well-defined rules for the solution of a problem in a finite number of steps; for example, a complete specification of a sequence of arithmetic operations for evaluating sine x to a given precision. (2) Any sequence of operations for performing a specific task."(IEEE, "IEEE Standard Glossary of Software Engineering Terminology", 1990)

"A computational procedure; a neural net training algorithm is a step by step procedure for setting the weights of the net. Training algorithms are also known as learning rules." (Laurene V Fausett, "Fundamentals of Neural Networks: Architectures, Algorithms, and Applications", 1994)

"A process or rule a machine uses for processing." (Patrick Dalton, "Microsoft SQL Server Black Book", 1997)

"A series of steps or expressions that solves a problem." (Microsoft Corporation, "Microsoft SQL Server 7.0 Data Warehouse Training Kit", 2000)

"A common procedure or step-by-step methodology for performing a specific task and producing desired results." (Greg Perry, "Sams Teach Yourself Beginning Programming in 24 Hours" 2nd Ed., 2001)

"open, cyclic, or arbitrarily structured sequence of exactly defined unconditional or conditional instructions." (Teuvo Kohonen, "Self-Organizing Maps" 3rd Ed., 2001)

"A set of statements organized to solve a problem in a finite number of steps." (Margaret Y Chu, "Blissful Data ", 2004)

"A set of rules for the solution of a problem in a finite number of steps; for example, a complete specification of a sequence of arithmetic operations for calculating a numeric value to a given precision." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

"A set of statements organized to solve a problem in a finite number of steps." (William H Inmon, "Building the Data Warehouse", 2005)

"A computer program (or procedure) that is a step-by-step procedure, solving a problem, in a finite number of steps." (Gavin Powell, "Beginning Database Design", 2006)

"An algorithm is a finite sequence of instructions for solving a particular problem or performing a task. In terms of cryptography, an algorithm is a step-by-step procedure for encrypting, decrypting, or calculating cryptographic hashes from data." (Michael Coles & Rodney Landrum, , "Expert SQL Server 2008 Encryption", 2008)

"A formula or procedure for solving a problem or carrying out a task. An algorithm is a set of steps in a very specific order, such as a mathematical formula or the instructions in a computer program." (J P Getty Trust, "Introduction to Metadata" 2nd Ed, 2008)

"Sets of steps, operations, or procedures that will produce a particular outcome; like a recipe." (Robert Nisbet et al, "Handbook of statistical analysis and data mining applications", 2009)

"A mathematical rule for solving a problem; a predetermined set of rules used to solve a problem in a finite number of steps." (Linda Volonino & Efraim Turban, "Information Technology for Management 8th Ed", 2011)

"A set of rules or steps that will result in a defined end from a defined start." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"A well-defined sequence of steps, explained clearly enough that even a computer could do them." (Jon Orwant et al, "Programming Perl" 4th Ed., 2012)

"A finite series of well-defined steps that achieve a desired outcome. These steps may be deterministic or include random or probabilistic elements." (Evan Stubbs, "Delivering Business Analytics: Practical Guidelines for Best Practice", 2013)

"A finite series of well-defined steps that achieve a desired outcome. These steps may be deterministic or include random or probabilistic elements." (Evan Stubbs, "Big Data, Big Innovation", 2014)

"The instructions that govern the flow of activity in a procedure" (Daniel Linstedt & W H Inmon, "Data Architecture: A Primer for the Data Scientist", 2014)

"A mathematical formula used to analyze data." (Jason Williamson, "Getting a Big Data Job For Dummies", 2015)

"A software recipe that explains how to solve a particular programming problem." (Rod Stephens, "Beginning Software Engineering", 2015)

"A step-by-step description of a specific process, procedure, or method." (Judith S Hurwitz, "Cognitive Computing and Big Data Analytics", 2015)

"Unambiguous instructions for solving a problem or subproblem in a finite amount of time using a finite amount of data" (Nell Dale & John Lewis, "Computer Science Illuminated" 6th Ed., 2015)

"A sequence of unambiguous instructions that solve a problem, within a finite amount of time, given a set of valid input" (Nell Dale et al, "Object-Oriented Data Structures Using Java" 4th Ed., 2016)

"Finite sequence of operations feasible, unambiguous, the execution gives a solution to a problem. Or what is easier to say, an instruction to do something. Featured on the new design processes to systematize computer graphics operations." (Mauro Chiarella, "Folds and Refolds: Space Generation, Shapes, and Complex Components", 2016)

"A sequence of clearly defined steps that describe a process to follow a finite set of unambiguous instructions with clear start and end points." (Karl Beecher, "Computational Thinking - A beginner's guide to problem-solving and programming", 2017)

"A step-by-step set of rules to follow in calculations to meet analytical objectives such as prediction or classification." (Jonathan Ferrar et al, "The Power of People: Learn How Successful Organizations Use Workforce Analytics To Improve Business Performance", 2017)

"A set of rules for calculating results or solving problems that have been programmed for use in a model-driven DSS." (Ciara Heavin & Daniel J Power, "Decision Support, Analytics, and Business Intelligence" 3rd Ed., 2017)

"A set of computational rules to be followed to solve a mathematical problem. More recently, the term has been adopted to refer to a process to be followed, often by a computer." (Soraya Sedkaoui, "Big Data Analytics for Entrepreneurial Success", 2018)

"A step-by-step recipe that you follow to achieve a goal, not unlike baking a cake." (Terrence J Sejnowski, "The Deep Learning Revolution", 2018)

"An algorithm is a procedure that solves a given problem by a finite number of steps. A problem solved by an algorithm is said computable ." (Crescenzio Gallo, "Building Gene Networks by Analyzing Gene Expression Profiles", 2018)

"a rule or formula that takes input variables and produces an output, such as a prediction, a classification, or a probability" ((David Spiegelhalter, "The Art of Statistics: Learning from Data", 2019)

"An algorithm is a well-defined procedure that allows a computer to solve a problem. A particular problem can typically be solved by more than one algorithm. Optimization is the process of finding the most efficient algorithm for a given task." (Edward T Chen, "Deep Learning and Sustainable Telemedicine", 2020)

"An algorithm is an ordered, accurate step-by-step process for a problem that provides a solution in a finite number of steps and that is unambiguous." (Hari K Kondaveeti et al, "Deep Learning Applications in Agriculture: The Role of Deep Learning in Smart Agriculture", 2021)

"Rules that allow AI to learn patterns in the data, classify, and to predict." (Sujata Ramnarayan, "Marketing and Artificial Intelligence: Personalization at Scale", 2021)

 "A set of rules and operations optimized for a specific outcome." (Forrester)

01 January 2007

🌁Software Engineering: Architecture (Definitions)

"The organizational structure of a system or component." (IEEE, "IEEE Standard Glossary of Software Engineering Terminology", 1990)

"The structure (components, connections, and constraints) of a product, process, or element. The architecture of a particular application is defined by the classes and the interrelation of the classes. At another level, the architecture of a system is determined by the arrangement of the hardware and software components. The terms logical architecture and physical architecture are often used to emphasize this distinction." (Atul Apte, "Java Connector Architecture: Building Custom Connectors and Adapters", 2002)

"A term used to designate the structure/foundation of a computer system and its applications." (Margaret Y Chu, "Blissful Data ", 2004)

"A framework defining key elements of a product that includes the hardware (platform) and software components, their partitioning (structures, arrangement, and relations), and the rules governing the interactions (data transfer, control, error handling) between these components, and between these components and external entities (users and other systems). Partitioning describes the (static) structure and relations between the components, including (a) which software components reside on which hardware components, and (b) the nature and scope of each component (i.e., the functions and data allocated to each component). The rules for the dynamic interaction of the components include error detection, propagation, and handling; interprocess communication and synchronization; human/machine interactions ('look and feel'); fundamental data elements (definition, allowed values, range, precision, default value, internal representation, units of measure; standard coordinate systems; internal units of measure; and physical models. May optionally define design and construction constraints (e.g., specific COTS components to be used)." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

"The science or art of building. This includes the designing and planning process that occurs in problem solving." (Sharon Allen & Evan Terry, "Beginning Relational Data Modeling" 2nd Ed., 2005)

"In information processing, the design approach taken for developing a program or system." (Judith Hurwitz et al, "Service Oriented Architecture For Dummies" 2nd Ed., 2009)

"The specification of the largest-scale design optimization decisions for a system. This is divided into five primary views: subsystem and component architecture, concurrency and resource architecture, distribution architecture, safety and reliability architecture, and deployment architecture." (Bruce P Douglass, "Real-Time Agility", 2009)

[reference architecture:] "Provides a proven template of an architecture for a particular domain that contains the supporting artifacts to enable their reuse." (Martin Oberhofer et al, "The Art of Enterprise Information Architecture", 2010)

"An organized set of consensus decisions on policies, principles, services, common solutions, standards, and guidelines as well as specific vendor products and technologies used to provide information technology (IT)." (Craig S Mullins, "Database Administration", 2012)

"In information processing, the design approach taken in developing a program or system. " (Marcia Kaufman et al, "Big Data For Dummies", 2013)

[application architecture:] "Enterprise architecture domain focused on the logical knowledge of applications, their links, and their positioning in the system. By extension, the logical structure of the IS, which can include SOA components, data repositories, or elements to interface with the outside world." (Gilbert Raymond & Philippe Desfray, "Modeling Enterprise Architecture with TOGAF", 2014)

"Reference architecture serves as a blueprint for all enterprise information management (EIM) solutions in an enterprise. Therefore, it can be seen as the big picture view of information management for a given enterprise. The reference architecture has all the relevant solution components needed to build an end-to-end EIM solution for the enterprise and includes the layers - information sourcing, master information management, information integration and exchange, information warehousing, and reservoir and information delivery and consumption." (Saumya Chaki, "Enterprise Information Management in Practice", 2015)

"1.A formal description of a system, or a detailed plan of the system at component level, to guide its implementation (source: ISO/IEC 42010:2007). 2.The structure of components, their inter-relationships, and the principles and guidelines governing their design and evolution over time." (by Brian Johnson & Leon-Paul de Rouw, "Collaborative Business Design", 2017)

"The way the component parts of an entity are arranged, organized, and managed." (William Stallings, "Effective Cybersecurity: A Guide to Using Best Practices and Standards", 2018)

"The architecture of a software system (at a given point in time) is its organization or structure of significant components interacting through interfaces; these components comprise successively smaller components and interfaces." (Bruce MacIsaac & Per Kroll, "Agility and Discipline Made Easy: Practices from OpenUP and RUP", 2006)

"The fundamental organization of a system, embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution." (ANSI/IEEE)

"The structure of a system or service, including the relationships of components to each other and to the environment they are in" (ITIL)

31 December 2006

✏️Danyel Fisher - Collected Quotes

"A dimension is an attribute that groups, separates, or filters data items. A measure is an attribute that addresses the question of interest and that the analyst expects to vary across the dimensions. Both the measures and the dimensions might be attributes directly found in the dataset or derived attributes calculated from the existing data." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"A well-operationalized task, relative to the underlying data, fulfills the following criteria: (1) Can be computed based on the data; (2) Makes specific reference to the attributes of the data; (3) Has a traceable path from the high-level abstract questions to a set of concrete, actionable tasks." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"An actionable task means that it is possible to act on its result. That action might be to present a useful result to a decision maker or to proceed to a next step in a different result. An answer is actionable when it no longer needs further work to make sense of it." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Every dataset has subtleties; it can be far too easy to slip down rabbit holes of complications. Being systematic about the operationalization can help focus our conversations with experts, only introducing complications when needed." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Color is difficult to use effectively. A small number of well-chosen colors can be highly distinguishable, particularly for categorical data, but it can be difficult for users to distinguish between more than a handful of colors in a visualization. Nonetheless, color is an invaluable tool in the visualization toolbox because it is a channel that can carry a great deal of meaning and be overlaid on other dimensions. […] There are a variety of perceptual effects, such as simultaneous contrast and color deficiencies, that make precise numerical judgments about a color scale difficult, if not impossible." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Creating effective visualizations is hard. Not because a dataset requires an exotic and bespoke visual representation - for many problems, standard statistical charts will suffice. And not because creating a visualization requires coding expertise in an unfamiliar programming language [...]. Rather, creating effective visualizations is difficult because the problems that are best addressed by visualization are often complex and ill-formed. The task of figuring out what attributes of a dataset are important is often conflated with figuring out what type of visualization to use. Picking a chart type to represent specific attributes in a dataset is comparatively easy. Deciding on which data attributes will help answer a question, however, is a complex, poorly defined, and user-driven process that can require several rounds of visualization and exploration to resolve." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Dashboards are a type of multiform visualization used to summarize and monitor data. These are most useful when proxies have been well validated and the task is well understood. This design pattern brings a number of carefully selected attributes together for fast, and often continuous, monitoring - dashboards are often linked to updating data streams. While many allow interactivity for further investigation, they typically do not depend on it. Dashboards are often used for presenting and monitoring data and are typically designed for at-a-glance analysis rather than deep exploration and analysis." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Designing effective visualizations presents a paradox. On the one hand, visualizations are intended to help users learn about parts of their data that they don’t know about. On the other hand, the more we know about the users’ needs and the context of their data, the better we can design a visualization to serve them." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Dimensionality reduction is a way of reducing a large number of different measures into a smaller set of metrics. The intent is that the reduced metrics are a simpler description of the complex space that retains most of the meaning. […] Clustering techniques are similarly useful for reducing a large number of items into a smaller set of groups. A clustering technique finds groups of items that are logically near each other and gathers them together." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Maps also have the disadvantage that they consume the most powerful encoding channels in the visualization toolbox - position and size - on an aspect that is held constant. This leaves less effective encoding channels like color for showing the dimension of interest." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"[…] no single visualization is ever quite able to show all of the important aspects of our data at once - there just are not enough visual encoding channels. […] designing effective visualizations to make sense of data is not an art - it is a systematic and repeatable process." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"[…] the data itself can lead to new questions too. In exploratory data analysis (EDA), for example, the data analyst discovers new questions based on the data. The process of looking at the data to address some of these questions generates incidental visualizations - odd patterns, outliers, or surprising correlations that are worth looking into further." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"The field of [data] visualization takes on that goal more broadly: rather than attempting to identify a single metric, the analyst instead tries to look more holistically across the data to get a usable, actionable answer. Arriving at that answer might involve exploring multiple attributes, and using a number of views that allow the ideas to come together. Thus, operationalization in the context of visualization is the process of identifying tasks to be performed over the dataset that are a reasonable approximation of the high-level question of interest." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"The general concept of refining questions into tasks appears across all of the sciences. In many fields, the process is called operationalization, and refers to the process of reducing a complex set of factors to a single metric. The field of visualization takes on that goal more broadly: rather than attempting to identify a single metric, the analyst instead tries to look more holistically across the data to get a usable, actionable answer. Arriving at that answer might involve exploring multiple attributes, and using a number of views that allow the ideas to come together. Thus, operationalization in the context of visualization is the process of identifying tasks to be performed over the dataset that are a reasonable approximation of the high-level question of interest." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"The goal of operationalization is to refine and clarify the question until the analyst can forge an explicit link between the data that they can find and the questions they would like to answer. […] To achieve this, the analyst searches for proxies. Proxies are partial and imperfect representations of the abstract thing that the analyst is really interested in. […] Selecting and interpreting proxies requires judgment and expertise to assess how well, and with what sorts of limitations, they represent the abstract concept." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"The operationalization process is an iterative one and the end point is not precisely defined. The answer to the question of how far to go is, simply, far enough. The process is done when the task is directly actionable, using the data at hand. The analyst knows how to describe the objects, measures, and groupings in terms of the data - where to find it, how to compute, and how to aggregate it. At this point, they know what the question will look like and they know what they can do to get the answer." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"The intention behind prototypes is to explore the visualization design space, as opposed to the data space. A typical project usually entails a series of prototypes; each is a tool to gather feedback from stakeholders and help explore different ways to most effectively support the higher-level questions that they have. The repeated feedback also helps validate the operationalization along the way." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Rapid prototyping is a process of trying out many visualization ideas as quickly as possible and getting feedback from stakeholders on their efficacy. […] The design concept of 'failing fast' informs this: by exploring many different possible visual representations, it quickly becomes clear which tasks are supported by which techniques." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Too many simultaneous encodings will be overwhelming to the reader; colors must be easily distinguishable, and of a small enough number that the reader can interpret them."  (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

"Visualizations provide a direct and tangible representation of data. They allow people to confirm hypotheses and gain insights. When incorporated into the data analysis process early and often, visualizations can even fundamentally alter the questions that someone is asking." (Danyel Fisher & Miriah Meyer, "Making Data Visual", 2018)

✏️Dina Gray - Collected Quotes

"Although performance measurement is often linked to tools such as scorecards, dashboards, performance targets, indicators and information systems, it would be naïve to consider the measurement of performance as just a technical issue. Indeed, measurement is often used as a way of attempting to bring clarity to complex and confusing situations." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"'Big Data" is certainly changing the way organizations operate, and our capacity to do planning, budgeting and forecasting, as well as the management of our processes and supply chains, has radically improved. However, greater availability of data is also being accompanied by two major challenges: firstly, many managers are now required to develop data-oriented management systems to make sense of the phenomenal amount of data their organizations and their main partners are producing. Secondly, whilst the volume of data that we now have access to is certainly seductive and potentially very useful, it can also be overwhelming." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"[...] introducing an excessive number of measures is only the start of the problem. The other is that measures tend to stick, unless questioned and revised. As the world changes, so does the environment in which an organization operates. Priorities change, new drivers of performance emerge, and different operating models are employed. It would therefore make sense that the performance measurement system is also revised to reflect these changes." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"Measurement is often associated with the objectivity and neatness of numbers, and performance measurement efforts are typically accompanied by hope, great expectations and promises of change; however, these are then often followed by disbelief, frustration and what appears to be sheer madness." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"Measurement is often seen as a tool that helps reduce the complexity of the world. Organizations, with their uncertainty and confusion, are full of people, patterns and trends; and measurement seems to offer a promise of bringing order, rationality and control into this chaos." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"One of the most puzzling things about performance measurement is that, regardless of the countless negative experiences, as well as a constant stream of similar failures reported in the media, organizations continue to apply the same methods and constantly fall into the same traps. This is because commonly held beliefs about the measurement and management of performance are rarely challenged." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"Performance measures by themselves are simply tools that may or may not be used by managers and staff. However, if your organization has an addiction to measurement, sooner or later people will start relying on measures excessively, and common sense will gradually begin to be replaced by the measures themselves leading the organization into the eye of the measurement madness hurricane." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"Regularly, and unfortunately more often than might be expected, organizations can become so fixated on the narrow task of measuring and reporting performance that measures lose their meaning, and no one relies on them for real decision-making. [...] More worryingly, sometimes performance measures are introduced without any intention of providing meaningful data for making decisions in the first place. In this case, such indicators are often treated with contempt." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"Since perfect measures of performance do not exist, organizations use proxies - indicators that approximate or represent performance in the absence of perfect measures. [...] Over time, proxies are perceived to rep￾resent true performance." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

"When all you see and believe is numbers, it becomes increasingly difficult to decide when to react and intervene. [...] The most obvious course of action is to set aside the numbers and try to understand the underlying causes of these changes. However, the over-reliance on measurement instead drives many managers to design 'thresholds' or 'colour codes' for numbers, thus adding another layer of abstraction to measurement and keeping these managers firmly desensitized to the meaning of per￾formance information." (Dina Gray et al, "Measurement Madness: Recognizing and avoiding the pitfalls of performance measurement", 2015)

✏️Edward R Tufte - Collected Quotes

"A good rule of thumb for deciding how long the analysis of the data actually will take is (1) to add up all the time for everything you can think of - editing the data, checking for errors, calculating various statistics, thinking about the results, going back to the data to try out a new idea, and (2) then multiply the estimate obtained in this first step by five." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Almost all efforts at data analysis seek, at some point, to generalize the results and extend the reach of the conclusions beyond a particular set of data. The inferential leap may be from past experiences to future ones, from a sample of a population to the whole population, or from a narrow range of a variable to a wider range. The real difficulty is in deciding when the extrapolation beyond the range of the variables is warranted and when it is merely naive. As usual, it is largely a matter of substantive judgment - or, as it is sometimes more delicately put, a matter of 'a priori nonstatistical considerations'." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"[…] fitting lines to relationships between variables is often a useful and powerful method of summarizing a set of data. Regression analysis fits naturally with the development of causal explanations, simply because the research worker must, at a minimum, know what he or she is seeking to explain." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Fitting lines to relationships between variables is the major tool of data analysis. Fitted lines often effectively summarize the data and, by doing so, help communicate the analytic results to others. Estimating a fitted line is also the first step in squeezing further information from the data." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"If two or more describing variables in an analysis are highly intercorrelated, it will be difficult and perhaps impossible to assess accurately their independent impacts on the response variable. As the association between two or more describing variables grows stronger, it becomes more and more difficult to tell one variable from the other. This problem, called 'multicollinearity' in the statistical jargon, sometimes causes difficulties in the analysis of nonexperimental data. […] No statistical technique can go very far to remedy the problem because the fault lies basically with the data rather than the method of analysis. Multicollinearity weakens inferences based on any statistical method - regression, path analysis, causal modeling, or cross-tabulations (where the difficulty shows up as a lack of deviant cases and as near-empty cells)."  (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"[…] it is not enough to say: 'There's error in the data and therefore the study must be terribly dubious'. A good critic and data analyst must do more: he or she must also show how the error in the measurement or the analysis affects the inferences made on the basis of that data and analysis." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Logging size transforms the original skewed distribution into a more symmetrical one by pulling in the long right tail of the distribution toward the mean. The short left tail is, in addition, stretched. The shift toward symmetrical distribution produced by the log transform is not, of course, merely for convenience. Symmetrical distributions, especially those that resemble the normal distribution, fulfill statistical assumptions that form the basis of statistical significance testing in the regression model." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Logging skewed variables also helps to reveal the patterns in the data. […] the rescaling of the variables by taking logarithms reduces the nonlinearity in the relationship and removes much of the clutter resulting from the skewed distributions on both variables; in short, the transformation helps clarify the relationship between the two variables. It also […] leads to a theoretically meaningful regression coefficient." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Our inability to measure important factors does not mean either that we should sweep those factors under the rug or that we should give them all the weight in a decision. Some important factors in some problems can be assessed quantitatively. And even though thoughtful and imaginative efforts have sometimes turned the 'unmeasurable' into a useful number, some important factors are simply not measurable. As always, every bit of the investigator's ingenuity and good judgment must be brought into play. And, whatever un- knowns may remain, the analysis of quantitative data nonetheless can help us learn something about the world - even if it is not the whole story." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Quantitative techniques will be more likely to illuminate if the data analyst is guided in methodological choices by a substantive understanding of the problem he or she is trying to learn about. Good procedures in data analysis involve techniques that help to (a) answer the substantive questions at hand, (b) squeeze all the relevant information out of the data, and (c) learn something new about the world." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Random data contain no substantive effects; thus if the analysis of the random data results in some sort of effect, then we know that the analysis is producing that spurious effect, and we must be on the lookout for such artifacts when the genuine data are analyzed." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Sometimes clusters of variables tend to vary together in the normal course of events, thereby rendering it difficult to discover the magnitude of the independent effects of the different variables in the cluster. And yet it may be most desirable, from a practical as well as scientific point of view, to disentangle correlated describing variables in order to discover more effective policies to improve conditions. Many economic indicators tend to move together in response to underlying economic and political events." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"The problem of multicollinearity involves a lack of data, a lack of information. […] Recognition of multicollinearity as a lack of information has two important consequences: (1) In order to alleviate the problem, it is necessary to collect more data - especially on the rarer combinations of the describing variables. (2) No statistical technique can go very far to remedy the problem because the fault lies basically with the data rather than the method of analysis. Multicollinearity weakens inferences based on any statistical method - regression, path analysis, causal modeling, or cross-tabulations (where the difficulty shows up as a lack of deviant cases and as near-empty cells)." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Statistical techniques do not solve any of the common-sense difficulties about making causal inferences. Such techniques may help organize or arrange the data so that the numbers speak more clearly to the question of causality - but that is all statistical techniques can do. All the logical, theoretical, and empirical difficulties attendant to establishing a causal relationship persist no matter what type of statistical analysis is applied." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"The language of association and prediction is probably most often used because the evidence seems insufficient to justify a direct causal statement. A better practice is to state the causal hypothesis and then to present the evidence along with an assessment with respect to the causal hypothesis - instead of letting the quality of the data determine the language of the explanation." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"The logarithmic transformation serves several purposes: (1) The resulting regression coefficients sometimes have a more useful theoretical interpretation compared to a regression based on unlogged variables. (2) Badly skewed distributions - in which many of the observations are clustered together combined with a few outlying values on the scale of measurement - are transformed by taking the logarithm of the measurements so that the clustered values are spread out and the large values pulled in more toward the middle of the distribution. (3) Some of the assumptions underlying the regression model and the associated significance tests are better met when the logarithm of the measured variables is taken." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"The matching procedure often helps inform the reader what is going on in the data […] Matching has some defects, chiefly that it is difficult to do a very good job of matching in complex situations without a large number of cases. […] One limitation of matching, then, is that quite often the match is not very accurate. A second limitation is that if we want to control for more than one variable using matching procedures, the tables begin to have combinations of categories without any cases at all in them, and they become somewhat more difficult for the reader to understand." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"The use of statistical methods to analyze data does not make a study any more 'scientific', 'rigorous', or 'objective'. The purpose of quantitative analysis is not to sanctify a set of findings. Unfortunately, some studies, in the words of one critic, 'use statistics as a drunk uses a street lamp, for support rather than illumination'. Quantitative techniques will be more likely to illuminate if the data analyst is guided in methodological choices by a substantive understanding of the problem he or she is trying to learn about. Good procedures in data analysis involve techniques that help to (a) answer the substantive questions at hand, (b) squeeze all the relevant information out of the data, and (c) learn something new about the world." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"Typically, data analysis is messy, and little details clutter it. Not only confounding factors, but also deviant cases, minor problems in measurement, and ambiguous results lead to frustration and discouragement, so that more data are collected than analyzed. Neglecting or hiding the messy details of the data reduces the researcher's chances of discovering something new." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"An especially effective device for enhancing the explanatory power of time-series displays is to add spatial dimensions to the design of the graphic, so that the data are moving over space (in two or three dimensions) as well as over time. […] Occasionally graphics are belligerently multivariate, advertising the technique rather than the data." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Clear, detailed, and thorough labeling should be used to defeat graphical distortion and ambiguity. Write out explanations of the data on the graphic itself. Label important events in the data." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Each part of a graphic generates visual expectations about its other parts and, in the economy of graphical perception, these expectations often determine what the eye sees. Deception results from the incorrect extrapolation of visual expectations generated at one place on the graphic to other places." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"For many people the first word that comes to mind when they think about statistical charts is 'lie'. No doubt some graphics do distort the underlying data, making it hard for the viewer to learn the truth. But data graphics are no different from words in this regard, for any means of communication can be used to deceive. There is no reason to believe that graphics are especially vulnerable to exploitation by liars; in fact, most of us have pretty good graphical lie detectors that help us see right through frauds." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Graphical excellence is the well-designed presentation of interesting data - a matter of substance, of statistics, and of design. Graphical excellence consists of complex ideas communicated with clarity, precision, and efficiency. Graphical excellence is that which gives to the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space. Graphical excellence is nearly always multivariate. And graphical excellence requires telling the truth about the data." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Graphical competence demands three quite different skills: the substantive, statistical, and artistic. Yet now most graphical work, particularly at news publications, is under the direction of but a single expertise - the artistic. Allowing artist-illustrators to control the design and content of statistical graphics is almost like allowing typographers to control the content, style, and editing of prose. Substantive and quantitative expertise must also participate in the design of data graphics, at least if statistical integrity and graphical sophistication are to be achieved." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

" In time-series displays of money, deflated and standardized units of monetary measurement are nearly always better than nominal units." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Inept graphics also flourish because many graphic artists believe that statistics are boring and tedious. It then follows that decorated graphics must pep up, animate, and all too often exaggerate what evidence there is in the data. […] If the statistics are boring, then you've got the wrong numbers." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Modern data graphics can do much more than simply substitute for small statistical tables. At their best, graphics are instruments for reasoning about quantitative information. Often the most effective way to describe, explore, and summarize a set of numbers even a very large set - is to look at pictures of those numbers. Furthermore, of all methods for analyzing and communicating statistical information, well-designed data graphics are usually the simplest and at the same time the most powerful." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Nearly all those who produce graphics for mass publication are trained exclusively in the fine arts and have had little experience with the analysis of data. Such experiences are essential for achieving precision and grace in the presence of statistics. [...] Those who get ahead are those who beautified data, never mind statistical integrity." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Of course, false graphics are still with us. Deception must always be confronted and demolished, even if lie detection is no longer at the forefront of research. Graphical excellence begins with telling the truth about the data." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Of course, statistical graphics, just like statistical calculations, are only as good as what goes into them. An ill-specified or preposterous model or a puny data set cannot be rescued by a graphic (or by calculation), no matter how clever or fancy. A silly theory means a silly graphic." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Relational graphics are essential to competent statistical analysis since they confront statements about cause and effect with evidence, showing how one variable affects another." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The conditions under which many data graphics are produced - the lack of substantive and quantitative skills of the illustrators, dislike of quantitative evidence, and contempt for the intelligence of the audience-guarantee graphic mediocrity. These conditions engender graphics that (1) lie; (2) employ only the simplest designs, often unstandardized time-series based on a small handful of data points; and (3) miss the real news actually in the data." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The interior decoration of graphics generates a lot of ink that does not tell the viewer anything new. The purpose of decoration varies - to make the graphic appear more scientific and precise, to enliven the display, to give the designer an opportunity to exercise artistic skills. Regardless of its cause, it is all non-data-ink or redundant data-ink, and it is often chartjunk."  (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The number of information-carrying (variable) dimensions depicted should not exceed the number of dimensions in the data." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"[…] the only worse design than a pie chart is several of them, for then the viewer is asked to compare quantities located in spatial disarray both within and between pies. […] Given their low data-density and failure to order numbers along a visual dimension, pie charts should never be used." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The problem with time-series is that the simple passage of time is not a good explanatory variable: descriptive chronology is not causal explanation. There are occasional exceptions, especially when there is a clear mechanism that drives the Y-variable." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The representation of numbers, as physically measured on the surface of the graphic itself, should be directly proportional to the numerical quantities represented." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The theory of the visual display of quantitative information consists of principles that generate design options and that guide choices among options. The principles should not be applied rigidly or in a peevish spirit; they are not logically or mathematically certain; and it is better to violate any principle than to place graceless or inelegant marks on paper. Most principles of design should be greeted with some skepticism, for word authority can dominate our vision, and we may come to see only though the lenses of word authority rather than with our own eyes." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"The time-series plot is the most frequently used form of graphic design. With one dimension marching along to the regular rhythm of seconds, minutes, hours, days, weeks, months, years, centuries, or millennia, the natural ordering of the time scale gives this design a strength and efficiency of interpretation found in no other graphic arrangement." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that heavoid all detail and treat his subjects only in outline, but that every word tell." (Edward R Tufte, "The Visual Display of Quantitative Information", 1983)

"At the heart of quantitative reasoning is a single question: Compared to what? Small multiple designs, multivariate and data bountiful, answer directly by visually enforcing comparisons of changes, of the differences among objects, of the scope of alternatives. For a wide range of problems in data presentation, small multiples are the best design solution." (Edward R Tufte, "Envisioning Information", 1990) 

"Confusion and clutter are failures of design, not attributes of information. And so the point is to find design strategies that reveal detail and complexity - rather than to fault the data for an excess of complication. Or, worse, to fault viewers for a lack of understanding. Among the most powerful devices for reducing noise and enriching the content of displays is the technique of layering and separation, visually stratifying various aspects of the data." (Edward R Tufte, "Envisioning Information", 1990)

"Consider this unsavory exhibit at right – chockablock with cliché and stereotype, coarse humor, and a content-empty third dimension. [...] Credibility vanishes in clouds of chartjunk; who would trust a chart that looks like a video game?" (Edward R Tufte, "Envisioning Information", 1990) [on diamond charts] 

"Gray grids almost always work well and, with a delicate line, may promote more accurate data reading and reconstruction than a heavy grid. Dark grid lines are chartjunk. When a graphic serves as a look-up table (rare indeed), then a grid may help with reading and interpolation. But even then the grid should be muted relative to the data." (Edward R Tufte, "Envisioning Information", 1990)

"Information consists of differences that make a difference." (Edward R Tufte, "Envisioning Information", 1990)

"Lurking behind chartjunk is contempt both for information and for the audience. Chartjunk promoters imagine that numbers and details are boring, dull, and tedious, requiring ornament to enliven. Cosmetic decoration, which frequently distorts the data, will never salvage an underlying lack of content. If the numbers are boring, then you've got the wrong numbers." (Edward R Tufte, "Envisioning Information", 1990)

"The ducks of information design are false escapes from flatland, adding pretend dimensions to impoverished data sets, merely fooling around with information." (Edward R Tufte, "Envisioning Information", 1990)

"Visual displays rich with data are not only an appropriate and proper complement to human capabilities, but also such designs are frequently optimal. If the visual task is contrast, comparison, and choice - as so often it is - then the more relevant information within eyespan, the better. Vacant, low-density displays, the dreaded posterization of data spread over pages and pages, require viewers to rely on visual memory - a weak skill - to make a contrast, a comparison, a choice." (Edward R Tufte, "Envisioning Information", 1990)

"We envision information in order to reason about, communicate, document, and preserve that knowledge - activities nearly always carried out on two-dimensional paper and computer screen. Escaping this flatland and enriching the density of data displays are the essential tasks of information design." (Edward R Tufte, "Envisioning Information", 1990)

"What about confusing clutter? Information overload? Doesn't data have to be ‘boiled down’ and  ‘simplified’? These common questions miss the point, for the quantity of detail is an issue completely separate from the difficulty of reading. Clutter and confusion are failures of design, not attributes of information. Often the less complex and less subtle the line, the more ambiguous and less interesting is the reading. Stripping the detail out of data is a style based on personal preference and fashion, considerations utterly indifferent to substantive content." (Edward R Tufte, "Envisioning Information", 1990)

"Good information design is clear thinking made visible, while bad design is stupidity in action." (Edward Tufte, "Visual Explanations" , 1997)

"Audience boredom is usually a content failure, not a decoration failure." (Edward R Tufte, "The cognitive style of PowerPoint", 2003)

"If your words or images are not on point, making them dance in color won't make them relevant." (Edward R Tufte, "The cognitive style of PowerPoint", 2003)

"A sparkline is a small, intense, simple, word-sized graphic with typographic resolution. Sparklines mean that graphics are no longer cartoonish special occasions with captions and boxes, but rather sparkline graphics can be everywhere a word or number can be: embedded in a sentence, table, headline, map, spreadsheet, graphic." (Edward R Tufte, "Beautiful Evidence", 2006)

"Areas surrounding data-lines may generate unintentional optical clutter. Strong frames produce melodramatic but content-diminishing visual effects. [...] A good way to assess a display for unintentional optical clutter is to ask 'Do the prominent visual effects convey relevant content?'" (Edward R Tufte, "Beautiful Evidence", 2006)

"By segregating evidence by mode (word, number, image, graph) , the current-day computer approach contradicts the spirit of sparklines, a spirit that makes no distinction among words, numbers, graphics, images. It is all evidence, after all. A good system for evidence display should be centered on evidence, not on a collection of application programs each devoted to a single mode of information." (Edward R Tufte, "Beautiful Evidence", 2006)

"By showing recent change in relation to many past changes, sparklines provide a context for nuanced analysis - and, one hopes, better decisions. [...] Sparklines efficiently display and narrate binary data (presence/absence, occurrence/non-occurrence, win/loss). [...] Sparklines can simultaneously accommodate several variables. [...] Sparklines can narrate on-going results detail for any process producing sequential binary outcomes." (Edward R Tufte, "Beautiful Evidence", 2006)

"Closely spaced lines produce moiré vibration, usually at its worst when data-lines (the figure) and spaces (the ground) between data-lines are approximately equal in size, and also when figure and ground contrast strongly in color value." (Edward R Tufte, "Beautiful Evidence", 2006)

"Conflicting with the idea of integrating evidence regardless of its these guidelines provoke several issues: First, labels are data. even intriguing data. [...] Second, when labels abandon the data points, then a code is often needed to relink names to numbers. Such codes, keys, and legends are Impediments to learning, causing the reader's brow to furrow. Third, segregating nouns from data-dots breaks up evidence on the basis of mode (verbal vs. nonverbal), a distinction lacking substantive relevance. Such separation is uncartographic; contradicting the methods of map design often causes trouble for any type of graphical display. Fourth, design strategies that reduce data-resolution take evidence displays in the wrong direction. Fifth, what clutter? Even this supposedly cluttered graph clearly shows the main ideas: brain and body mass are roughly linear in logarithms, and as both variables increase, this linearity becomes less tight." (Edward R Tufte, "Beautiful Evidence", 2006) [argumentation against Cleveland's recommendation of not using words on data plots]

"Documentation allows more effective watching, and we have the Fifth Principle for the analysis and presentation of data: 'Thoroughly describe the evidence. Provide a detailed title, indicate the authors and sponsors, document the data sources, show complete measurement scales, point out relevant issues.'" (Edward R Tufte, "Beautiful Evidence", 2006)

"Explanatory, journalistic, and scientific images should nearly always be mapped, contextualized, and placed on the universal grid. Mapped pictures combine representational images with scales, diagrams, overlays, numbers, words, images." (Edward R Tufte, "Beautiful Evidence", 2006)

"Evidence is evidence, whether words, numbers, images, din grams- still or moving. It is all information after all. For readers and viewers, the intellectual task remains constant regardless of the particular mode Of evidence: to understand and to reason about the materials at hand, and to appraise their quality, relevance. and integrity." (Edward R Tufte, "Beautiful Evidence", 2006)

"Excellent graphics exemplify the deep fundamental principles of analytical design in action. If this were not the case, then something might well be wrong with the principles." (Edward R Tufte, "Beautiful Evidence", 2006)

"Good design, however, can dispose of clutter and show all the data points and their names. [...] Clutter calls for a design solution, not a content reduction." (Edward R Tufte, "Beautiful Evidence", 2006)

"In general. statistical graphics should be moderately greater in length than in height. And, as William Cleveland discovered, for judging slopes and velocities up and down the hills in time-series, best is an aspect ratio that yields hill - slopes averaging 45°, over every cycle in the time-series. Variations in slopes are best detected when the slopes are around 45°, uphill or downhill." (Edward R Tufte, "Beautiful Evidence", 2006)

"Making a presentation is a moral act as well as an intellectual activity. The use of corrupt manipulations and blatant rhetorical ploys in a report or presentation - outright lying, flagwaving, personal attacks, setting up phony alternatives, misdirection, jargon-mongering, evading key issues, feigning disinterested objectivity, willful misunderstanding of other points of view - suggests that the presenter lacks both credibility and evidence. To maintain standards of quality, relevance, and integrity for evidence, consumers of presentations should insist that presenters be held intellectually and ethically responsible for what they show and tell. Thus consuming a presentation is also an intellectual and a moral activity." (Edward R Tufte, "Beautiful Evidence", 2006)

"Making an evidence presentation is a moral act as well as an intellectual activity. To maintain standards of quality, relevance, and integrity for evidence, consumers of presentations should insist that presenters be held intellectually and ethically responsible for what they show and tell. Thus consuming a presentation is also an intellectual and a moral activity." (Edward R Tufte, "Beautiful Evidence", 2006)

"Most techniques for displaying evidence are inherently multimodal, bringing verbal, visual. and quantitative elements together. Statistical graphics and maps arc visual-numerical fields labeled with words and framed by numbers. Even an austere image may evoke other images, new or remembered narrative, and perhaps a sense of scale and quantity. Words can simultaneously convey semantic and visual content, as the nouns on a map both name places and locate them in the two - space of latitude and longitude." (Edward R Tufte, "Beautiful Evidence", 2006)

"Principles of design should attend to the fundamental intellectual tasks in the analysis of evidence; thus we have the Second Principle for the analysis And presentation of data: Show causality, mechanism, explanation, systematic structure." (Edward R Tufte, "Beautiful Evidence", 2006)

"Sparklines are wordlike graphics, With an intensity of visual distinctions comparable to words and letters. [...] Words visually present both an overall shape and letter-by-letter detail; since most readers have seen the word previously, the visual task is usually one of quick recognition. Sparklines present an overall shape and aggregate pattern along with plenty of local detail. Sparklines are read the same way as words, although much more carefully and slowly." (Edward R Tufte, "Beautiful Evidence", 2006)

"Sparklines vastly increase the amount of data within our eyespan and intensify statistical graphics up to the everyday routine capabilities of the human eye-brain system for reasoning about visual evidence, seeing distinctions, and making comparisons. [...] Providing a straightforward and contextual look at intense evidence, sparkline graphics give us some chance to be approximately right rather than exactly wrong. (Edward R Tufte, "Beautiful Evidence", 2006)

"Sparklines work at intense resolutions, at the level of good typography and cartography. [...] Just as sparklines are like words, so then distributions of sparklines on a page are like sentences and paragraphs. The graphical idea here is make it wordlike and typographic - an idea that leads to reasonable answers for most questions about sparkline arrangements." (Edward R Tufte, "Beautiful Evidence", 2006)

"[...] the First Principle for the analysis and presentation data: 'Show comparisons, contrasts, differences'. The fundamental analytical act in statistical reasoning is to answer the question "Compared with what?". Whether we are evaluating changes over space or time, searching big data bases, adjusting and controlling for variables, designing experiments , specifying multiple regressions, or doing just about any kind of evidence-based reasoning, the essential point is to make intelligent and appropriate comparisons. Thus visual displays, if they are to assist thinking, should show comparisons." (Edward R Tufte, "Beautiful Evidence", 2006)

"The only thing that is 2-dimensional about evidence is the physical flatland of paper and computer screen. Flatlandy technologies of display encourage flatlandy thinking. Reasoning about evidence should not be stuck in 2 dimensions, for the world seek to understand is profoundly multivariate. Strategies of design should make multivariateness routine, nothing out of the ordinary. To think multivariate. show multivariate; the Third Principle for the analysis and presentation of data: 
'Show multivariate data; that is, show more than 1 or 2 variables.'" (Edward R Tufte, "Beautiful Evidence", 2006)

"The principles of analytical design are universal - like mathematics, the laws of Nature, the deep structure of language - and are not tied to any particular language, culture, style, century, gender, or technology of information display." (Edward R Tufte, "Beautiful Evidence", 2006)

"The purpose of an evidence presentation is to assist thinking. Thus presentations should be constructed so as to assist with the fundamental intellectual tasks in reasoning about evidence: describing the data, making multivariate comparisons, understanding causality, integrating a diversity Of evidence, and documenting the analysis. Thus the Grand Principle of analytical design: 'The principles of analytical design are derived from the principles of analytical thinking.' Cognitive tasks are turned into principles of evidence presentation and design." (Edward R Tufte, "Beautiful Evidence", 2006)

"The Sixth Principle for the analysis and display of data: 'Analytical presentations ultimately stand or fall depending on the quality, relevance, and integrity of their content.' This suggests that the most effective way to improve a presentation is to get better content. It also suggests that design devices and gimmicks cannot salvage failed content." (Edward R Tufte, "Beautiful Evidence", 2006)

"These little data lines, because of their active quality over time, are named sparklines - small, high-resolution graphics usually embedded in a full context of words, numbers, images. Sparklines are datawords: data-intense, design-simple, word-sized graphics." (Edward R Tufte, "Beautiful Evidence", 2006)

"Words. numbers. pictures, diagrams, graphics, charts, tables belong together. Excellent maps, which are the heart and soul of good practices in analytical graphics, routinely integrate words, numbers, line-art, grids, measurement scales. Rarely is a distinction among the different modes of evidence useful for making sound inferences. It is all information after all. Thus the Fourth Principle for the analysis and presentation of data: 'Completely integrate words, numbers, images, diagrams.'" (Edward R Tufte, "Beautiful Evidence", 2006)

30 December 2006

✏️Robert D Carlsen - Collected Quotes

"A systems analysis project is usually thought of as occurring in two separate phases [...]. The first phase involves both the study of the existing system and phase involves implementing the new or improved system. The second phase involves implementing the new or improved system. This means writing the detailed procedures and data processing programs, conducting various types of tests. and installing the new system." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"A system is an operation or combination of operations performed by men and, possibly, machines to carry out a specific business activity. This might be a total system that considers all the factors in the entire operation of an enterprise, or it might be a subsystem of that total." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"A systems analysis is a study of one of these systems or subsystems. The purpose is to evaluate the system in terms of one or more of the following factors ... efficiency, accuracy, timeliness, economy, and productivity ... and to design a new or improved system. The design should eliminate or minimize deficiencies and improve the overall operations. Basically, the systems analyst who performs the study is concerned with three things. First, he must consider what is currently being done. Second, he must develop a method for what should be done. Finally, he must plan for the new design's application and for implementation of the system. Systems analysis is the first step in the development of a successful automated computer system, but the results of a systems analysis do not necessarily have to result in an automated system." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"Objectives recorded on the System Specification work sheet, even though preliminary in nature, should be specific. It is never sufficient to state an objective in terms of simply improving an existing system or of implementing a computerized system. The idea that a system or an 'automated' system is a better system has been a popular concept too long. An improved system, per se, is of no benefit to a business client; implementing a better system in order to increase profits or reduce costs is of great benefit." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"Probably the most neglected area in systems analysis involves the planning and control of the project, especially those projects requiring automation. More than one disastrous project has been launched by 'computer people' who communicated their aims to the vexed manager using technical data processing jargon in lieu of specific lists of easily understood tasks, schedules, and costs. This problem applies equally to in-house projects or those requiring the services of outside consultants. Each project must first be planned in detail. Control is involved with comparing actual progress with the plan and taking corrective action when the two do not correspond. Without the plan, true control is not possible; the need for corrective action, its nature, extent, and urgency cannot be accurately determined." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"Project management is the process by which it is assured that the objective is achieved and resources are not wasted. Planning is one of the two parts of project management. Control is the other." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"The most important ingredient in any system analysis is the tody of fact on which it is based. This body of fact must be complete; it must fully descrite the system which is already in existence and the environment in which it operates. Although an essential part of it compries the forms and documents being used, these alone are not sufficient. The ultimate source of the critical facts is the people who are part of the system, the operators, the users, those who input the information, and the system mmagers. The only efficient way to obtain the required information is to ask these people; that is, to conduct a series of interviews." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"There are basically two types of flowcharts. One is the program flowchart and the other the systems flowchart. The program flowchart. sometimes called 'logic diagram', graphically portrays the data precessing program logic. [...] Systems flowcharts display the flow of information throughout all parts of a system, including the manual portions. Systems flowcharts can be of two types. One type is task-oriented, describing the flow of data in terms of the work being performed. The other is forms-oriented, following the forms through the functional structure of the system." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"There are several classes of flowcharts used in recording study data in the Workbook. The purpose of any chart; of course, is to clarify and to make the information more understandable. One of these types of charts is a Process Flow Chart. It concerns itself with the flow of physical materials, including documents, through a system, especially in terms of distance and time. It is most useful in analyzing some of the cost and benefit factors for existing and proposed systems. System flowcharts [...] have been called the analyst's 'shorthand'. They can be forms-oriented or task-oriented. These flowcharts are not only the primary way of recording data pertinent to the current system, but are used for developing and displaying the new system as well. Later, in the implementation phase, program flowcharts, a fundamental tool of programming, would be developed." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"The types of graphics used in operating a business fall into three main categories: diagrams, maps, and charts. Diagrams, such as organization diagrams, flow diagrams, and networks, are usually intended to graphically portray how an activity should be, or is being, accomplished, and who is responsible for that accomplishment. Maps such as route maps, location maps, and density maps, illustrate where an activity is, or should be, taking place, and what exists there. [...] Charts such as line charts, column charts, and surface charts, are normally constructed to show the businessman how much and when. Charts have the ability to graphically display the past, present, and anticipated future of an activity. They can be plotted so as to indicate the current direction that is being followed in relationship to what should be followed. They can indicate problems and potential problems, hopefully in time for constructive corrective action to be taken." (Robert D Carlsen & Donald L Vest, "Encyclopedia of Business Charts", 1977)
