A Software Engineer and data professional's blog on SQL, data, databases, data architectures, data management, programming, Software Engineering, Project Management, ERP implementation and other IT related topics.
25 April 2017
⛏️Data Management: Data Products (Definitions)
12 April 2017
⛏️Data Management: Accessibility (Definitions)
"Capable of being reached, capable of being used or seen." (Martin J Eppler, "Managing Information Quality" 2nd Ed., 2006)
"The degree to which data can be obtained and used." (Danette McGilvray, "Executing Data Quality Projects", 2008)
"The opportunity to find, as well as the ease and convenience associated with locating, information. Often, this is related to the physical location of the individual seeking the information and the physical location of the information in a book or journal." (Jimmie L Joseph & David P Cook, "Medical Ethical and Policy Issues Arising from RIA", 2008)
"An inherent quality characteristic that is a measure of the ability to access data when it is required." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)
"The ability to readily obtain data when needed." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
"Accessibility refers to the difficulty level for users to obtain data. Accessibility is closely linked with data openness, the higher the data openness degree, the more data types obtained, and the higher the degree of accessibility." (Li Cai & Yangyong Zhu, "The Challenges of Data Quality and Data Quality Assessment in the Big Data Era", 2015) [source]
"It is the state of each user to have access to any information at any time." (ihsan Eken & Basak Gezmen, "Accessibility for Everyone in Health Communication Mobile Application Usage", 2020)
"Data accessibility measures the extent to which government data are provided in open and re-usable formats, with their associated metadata." (OECD)
⛏️Data Management: Data Virtualization (Definitions)
"The concept of letting data stay 'where it lives; and developing a hardware and software architecture that exposes the data to various business processes and organizations. The goal of virtualization is to shield developers and users from the complexity of the underlying data structures." (Jill Dyché & Evan Levy, "Customer Data Integration: Reaching a Single Version of the Truth", 2006)
"The ability to easily select and combine data fragments from many different locations dynamically and in any way into a single data structure while also maintaining its semantic accuracy." (Michael M David & Lee Fesperman, "Advanced SQL Dynamic Data Modeling and Hierarchical Processing", 2013)
"The process of retrieving and manipulating data without requiring details of how the data formatted or where the data is located" (Daniel Linstedt & W H Inmon, "Data Architecture: A Primer for the Data Scientist", 2014)
"A data integration process used to gain more insights. Usually it involves databases, applications, file systems, websites, big data techniques, and so on." (Jason Williamson, "Getting a Big Data Job For Dummies", 2015)
"Data virtualization is an approach that allows an application to retrieve and manipulate data without requiring technical details about the data, such as how it is formatted at source or where it is physically located, and can provide a single customer view (or single view of any other entity) of the overall data. Some database vendors provide a database (virtual) query layer, which is also called a data virtualization layer. This layer abstracts the database and optimizes the data for better read performance. Another reason to abstract is to intercept queries for better security. An example is Amazon Athena." (Piethein Strengholt, "Data Management at Scale", 2020)
"A data integration process in order to gain more insights. Usually it involves databases, applications, file systems, websites, big data techniques, etc.)." (Analytics Insight)
"The integration and transformation of data in real time or near real time from disparate data sources in multicloud and hybrid cloud, to support business intelligence, reporting, analytics, and other workloads." (Forrester)
⛏️Data Management: Data Lineage (Definitions)
"A mechanism for recording information to determine the source of any piece of data, and the transformations applied to that data using Data Transformation Services (DTS). Data lineage can be tracked at the package and row levels of a table and provides a complete audit trail for information stored in a data warehouse. Data lineage is available only for packages stored in Microsoft Repository." (Microsoft Corporation, "SQL Server 7.0 System Administration Training Kit", 1999)
"This information is used by Data Transformation Services (DTS) when it works in conjunction with Meta Data Services. This information records the history of package execution and data transformations for each piece of data." (Anthony Sequeira & Brian Alderman, "The SQL Server 2000 Book", 2003)
"This is also called data provenance. It deals with the origin of data; it is all about documenting where data is, how it has been derived, and how it flows so you can manage and secure it appropriately as it is further processed by applications." (Martin Oberhofer et al, "Enterprise Master Data Management", 2008)
"This provides the functionality to determine where data comes from, how it is transformed, and where it is going. Data lineage metadata traces the lifecycle of information between systems, including the operations that are performed on the data." (Martin Oberhofer et al, "The Art of Enterprise Information Architecture", 2010)
"Data lineage refers to a set of identifiable points that can be used to understand details of data movement and transformation (e.g., transactional source field names, file names, data processing job names, programming rules, target table fields). Lineage describes the movement of data through systems from its origin or provenance to its use in a particular application. Lineage is related to both the data chain and the information life cycle. Most people concerned with the lineage of data want to understand two aspects of it: the data’s origin and the ways in which the data has changed since it was originally created. Change can take place within one system or between systems." (Laura Sebastian-Coleman, "Measuring Data Quality for Ongoing Improvement ", 2012)
06 April 2017
⛏️Data Management: Data Mesh (Definitions)
"Data Mesh is a sociotechnical approach to share, access and manage analytical data in complex and large-scale environments - within or across organizations." (Zhamak Dehghani, "Data Mesh: Delivering Data-Driven Value at Scale", 2021)
"A data mesh is an architectural concept in data engineering that gives business domains (divisions/departments) within a large organization ownership of the data they produce. The centralized data management team then becomes the organization’s data governance team." (Margaret Rouse, 2023) [source]
"Data Mesh is a design concept based on federated data and business domains. It applies product management thinking to data management with the outcome being Data Products. It’s technology agnostic and calls for a domain-centric organization with federated Data Governance." (Sonia Mezzetta, "Principles of Data Fabric", 2023)
"A data mesh is a decentralized data architecture with four specific characteristics. First, it requires independent teams within designated domains to own their analytical data. Second, in a data mesh, data is treated and served as a product to help the data consumer to discover, trust, and utilize it for whatever purpose they like. Third, it relies on automated infrastructure provisioning. And fourth, it uses governance to ensure that all the independent data products are secure and follow global rules."(James Serra, "Deciphering Data Architectures", 2024)
"A data mesh is a federated data architecture that emphasizes decentralizing data across business functions or domains such as marketing, sales, human resources, and more. It facilitates organizing and managing data in a logical way to facilitate the more targeted and efficient use and governance of the data across organizations." (Arshad Ali & Bradley Schacht, "Learn Microsoft Fabric", 2024)
"To explain a data mesh in one sentence, a data mesh is a centrally managed network of decentralized data products. The data mesh breaks the central data lake into decentralized islands of data that are owned by the teams that generate the data. The data mesh architecture proposes that data be treated like a product, with each team producing its own data/output using its own choice of tools arranged in an architecture that works for them. This team completely owns the data/output they produce and exposes it for others to consume in a way they deem fit for their data." (Aniruddha Deswandikar,"Engineering Data Mesh in Azure Cloud", 2024)
"A data mesh is a decentralized data architecture that organizes data by a specific business domain - for example, marketing, sales, customer service and more - to provide more ownership to the producers of a given data set." (IBM) [source]
"A data mesh is a new approach to designing data architectures. It takes a decentralized approach to data storage and management, having individual business domains retain ownership over their datasets rather than flowing all of an organization’s data into a centrally owned data lake." (Alteryx) [source]
"A Data Mesh is a solution architecture for the specific goal of building business-focused data products without preference or specification of the technology involved." (Gartner)
"A data mesh is an architectural framework that solves advanced data security challenges through distributed, decentralized ownership." (AWS) [source]
"Data mesh defines a platform architecture based on a decentralized network. The data mesh distributes data ownership and allows domain-specific teams to manage data independently." (TIBCO) [source]
"Data mesh refers to a data architecture where data is owned and managed by the teams that use it. A data mesh decentralizes data ownership to business domains–such as finance, marketing, and sales–and provides them a self-serve data platform and federated computational governance." (Qlik) [source]
05 April 2017
⛏️Data Management: Quality (Just the Quotes)
"Quality is never an accident; it is always the result of intelligent effort." (John Ruskin, "Seven Lamps of Architecture", 1849)
"It is most important that top management be quality-minded. In the absence of sincere manifestation of interest at the top, little will happen below." (Joseph M Juran, "Management of Inspection and Quality Control", 1945)
"Data are of high quality if they are fit for their intended use in operations, decision-making, and planning." (Joseph M Juran, 1964)
"When a product is manufactured by workers who find their work meaningful, it will inevitably be a product of high quality." (Pehr G Gyllenhammar, "Management", 1976)
"Quality management is a systematic way of guaranteeing that organized activities happen the way they are planned." (Philip B Crosby, "Quality Is Free: The Art of Making Quality Certain", 1977)
"The problem of quality management is not what people don't know about it. The problem is what they think they do know." (Philip B Crosby, "Quality Is Free: The Art of Making Quality Certain", 1977)
"Uncontrolled variation is the enemy of quality." (W Edwards Deming, 1980)
"Almost all quality improvement comes via simplification of design, manufacturing, layout, processes and procedures." (Tom Peters, "Thriving on Chaos", 1987)
"Quality is a matter of faith. You set your standards, and you have to stick by them no matter what. That's easy when you've got plenty of product on hand, but it's another thing when the freezer is empty and you've got a truck at the door waiting for the next shipment to come off the production line. That's when you really earn your reputation for quality." (Ben Cohen, Inc. Magazine, 1987)
"Quality is very simple. So simple, in fact, that it is difficult for people to understand." (Roger Hale, "Quest for Quality", 1987)
"[...] running numbers on a computer [is] easier than trying to judge quality." (Esther Dyson, Forbes, 1987)
"The [quality control] issue has more to do with people and motivation and less to do with capital and equipment than one would think. It involves a cultural change." (Michael Beer, The Washington Post, 1987)
"Cutting costs without improvements in quality is futile." (W Edwards Deming, Forbes, 1988)
"Quality planning consists of developing the products and processes required to meet customer's needs." (Joseph M Juran, "Juran on planning for quality", 1988)
"Quality means meeting customers' (agreed) requirements, formal and informal, at lowest cost, first time every time." (Robert L Flood, "Beyond TQM", 1993)
"Many quality failures arise because a customer uses the product in a manner different from that intended by the supplier." (Joseph M Juran, "The quality planning process", 1999)
"Quality goals that affect product salability should be based primarily on meeting or exceeding market quality. Because the market and the competition undoubtedly will be changing while the quality planning project is under way, goals should be set so as to meet or beat the competition estimated to be prevailing when the project is completed." (Joseph M Juran, "The quality planning process", 1999)
"'Quality' means freedom from deficiencies - freedom from errors that require doing work over again (rework) or that result in field failures, customer dissatisfaction, customer claims, and so on."
"'Quality' means those features of products which meet customer needs and thereby provide customer satisfaction." (Joseph M Juran, "How to think about quality", 1999)
"The anatomy of 'quality assurance' is very similar to that of quality control. Each evaluates actual quality. Each compares actual quality with the quality goal. Each stimulates corrective action as needed. What differs is the prime purpose to be served. Under quality control, the prime purpose is to serve those who are directly responsible for conducting operations - to help them regulate current operations. Under quality assurance, the prime purpose is to serve those who are not directly responsible for conducting operations but who have a need to know - to be informed as to the state of affairs and, hopefully, to be assured that all is well."
"To attain quality, it is well to begin by establishing the 'vision' for the organization, along with policies and goals. Conversion of goals into results (making quality happen) is then done through managerial processes - sequences of activities that produce the intended results."
"Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can't measure. Think about that for a minute. It means that we make quantity more important than quality." (Donella Meadows, "Thinking in Systems: A Primer", 2008)
"A model is a representation in that it (or its properties) is chosen to stand for some other entity (or its properties), known as the target system. A model is a tool in that it is used in the service of particular goals or purposes; typically these purposes involve answering some limited range of questions about the target system." (Wendy S Parker, "Confirmation and Adequacy-for-Purpose in Climate Modelling", Proceedings of the Aristotelian Society, Supplementary Volumes, Vol. 83, 2009)
20 March 2017
⛏️Data Management: Data Structure (Definitions)
"A logical relationship among data elements that is designed to support specific data manipulation functions (trees, lists, and tables)." (William H Inmon, "Building the Data Warehouse", 2005)
"Data stored in a computer in a way that (usually) allows efficient retrieval of the data. Arrays and hashes are examples of data structures." (Michael Fitzgerald, "Learning Ruby", 2007)
"A data structure in computer science is a way of storing data to be used efficiently." (Sahar Shabanah, "Computer Games for Algorithm Learning", 2011)
"Data structure is a general term referring to how data is organized. In modeling, it refers more specifically to the model itself. Tables are referred to as 'structures'." (Laura Sebastian-Coleman, "Measuring Data Quality for Ongoing Improvement ", 2012)
[probabilistic] "A data structure which exploits randomness to boost its efficiency, for example skip lists and Bloom filters. In the case of Bloom filters, the results of certain operations may be incorrect with a small probability." (Wei-Chih Huang & William J Knottenbelt, "Low-Overhead Development of Scalable Resource-Efficient Software Systems", 2014)
"A collection of methods for storing and organizing sets of data in order to facilitate access to them. More formally data structures are concise implementations of abstract data types, where an abstract data type is a set of objects together with a collection of operations on the elements of the set." (Ioannis Kouris et al, "Indexing and Compressing Text", 2015)
"A representation of the logical relationship between elements of data." (Adam Gordon, "Official (ISC)2 Guide to the CISSP CBK" 4th Ed., 2015)
"Is a schematic organization of data and relationship to express a reality of interest, usually represented in a diagrammatic form." (Maria T Artese Isabella Gagliardi, "UNESCO Intangible Cultural Heritage Management on the Web", 2015)
"The implementation of a composite data field in an abstract data type" (Nell Dale & John Lewis, "Computer Science Illuminated" 6th Ed., 2015)
"A way of organizing data so that it can be efficiently accessed and updated." (Vasileios Zois et al, "Querying of Time Series for Big Data Analytics", 2016)
"A particular way of storing information, allowing to a high level approach on the software implementation." (Katia Tannous & Fillipe de Souza Silva, "Particle Shape Analysis Using Digital Image Processing", 2018)
"It is a particular way of organizing data in a computer so that they can be used efficiently." (Edgar C Franco et al, "Implementation of an Intelligent Model Based on Machine Learning in the Application of Macro-Ergonomic Methods...", 2019)
"Way information is represented and stored." (Shalin Hai-Jew, "Methods for Analyzing and Leveraging Online Learning Data", 2019)
"A physical or logical relationship among a collection of data elements." (IEEE 610.5-1990)
⛏️Data Management: Data Sharing (Definitions)
"The ability to share individual pieces of data transparently from a database across different applications." (Microsoft Corporation, "SQL Server 7.0 System Administration Training Kit", 1999)
"Exchange of data and/or meta-data in a situation involving the use of open, freely available data formats, where process patterns are known and standard, and where not limited by privacy and confidentiality regulations." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
"Data sharing involves one entity sending data to another entity, usually with the understanding that the other entity will store and use the data. This process may involve free or purchased data, and it may be done willingly, or in compliance with regulations, laws, or court orders." (Jules H Berman, "Principles of Big Data: Preparing, Sharing, and Analyzing Complex Information", 2013)
"The ability of subsystems or application programs to access data directly and to change it while maintaining data integrity." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)
"The ability of two or more DB2 subsystems to directly access and change a single set of data." (BMC)
⛏️Data Management: Information Overload (Definitions)
"A state in which information can no longer be internalized productively by the individual due to time constraints or the large volume of received information." (Martin J Eppler, "Managing Information Quality" 2nd Ed., 2006)
"Phenomena related to the inability to absorb and manage effectively large amounts of information, creating inefficiencies, stress, and frustration. It has been exacerbated by advances in the generation, storage, and electronic communication of information." (Glenn J Myatt, "Making Sense of Data: A Practical Guide to Exploratory Data Analysis and Data Mining", 2006)
"A situation where relevant information becomes buried in a mass of irrelevant information" (Josep C Morales, "Information Disasters in Networked Organizations", 2008)
"A situation where individuals have access to so much information that it becomes impossible for them to function effectively, sometimes leading to where nothing gets done and the user gives the impression of being a rabbit caught in the glare of car headlights." Alan Pritchard, "Information-Rich Learning Concepts", 2009)
"is the situation when the information processing requirements exceed the information processing capacities." (Jeroen ter Heerdt & Tanya Bondarouk, "Information Overload in the New World of Work: Qualitative Study into the Reasons", 2009)
"Refers to an excess amount of information, making it difficult for individuals to effectively absorb and use information; increases the likelihood of poor decisions." (Leslie G Eldenburg & Susan K Wolcott, "Cost Management" 2nd Ed., 2011)
"The inability to cope with or process ever-growing amounts of data into our lives." (Linda Volonino & Efraim Turban, "Information Technology for Management" 8th Ed., 2011)
"The state where the rate or amount of input to a system or person outstrips the capacity or speed of processing that input successfully." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
19 March 2017
⛏️Data Management: Encryption (Definitions)
"A method for keeping sensitive information confidential by changing data into an unreadable form." (Microsoft Corporation, "SQL Server 7.0 System Administration Training Kit", 1999)
"The encoding of data so that the plain text is transformed into something unintelligible, called cipher text." (Tom Petrocelli, "Data Protection and Information Lifecycle Management", 2005)
"Reordering of bits of data to make it unintelligible (and therefore useless) to an unauthorized third party, while still enabling the authorized user to use the data after the reverse process of decryption." (David G Hill, "Data Protection: Governance, Risk Management, and Compliance", 2009)
"To transform information from readable plain text to unreadable cipher text to prevent unintended recipients from reading the data." (Janice M Roehl-Anderson, "IT Best Practices for Financial Managers", 2010)
"The process of transforming data using an algorithm (called a cipher) to make it unreadable to anyone except those possessing special knowledge, usually referred to as a key." (Craig S Mullins, "Database Administration", 2012)
"The process of converting readable data (plaintext) into a coded form (ciphertext) to prevent it from being read by an unauthorized party." (Microsoft, "SQL Server 2012 Glossary", 2012)
"The cryptographic transformation of data to produce ciphertext." (Manish Agrawal, "Information Security and IT Risk Management", 2014)
"The process of scrambling data in such a way that it is unreadable by unauthorized users but can be unscrambled by authorized users to be readable again." (Weiss, "Auditing IT Infrastructures for Compliance, 2nd Ed", 2015)
"The transformation of plaintext into unreadable ciphertext." (Shon Harris & Fernando Maymi, "CISSP All-in-One Exam Guide, 8th Ed", 2018)
"In computer security, the process of transforming data into an unintelligible form in such a way that the original data either cannot be obtained or can be obtained only by using a decryption process." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)
"Encryption is about translating the data into complex codes that cannot be interpreted (decrypted) without the use of a decryption key. These keys are typically distributed and stored separately. There are two types of encryption: symmetric key encryption and public key encryption. In symmetric key encryption, the key to both encrypt and decrypt is exactly the same. Public key encryption has two different keys. One key is used to encrypt the values (the public key), and one key is used to decrypt the data (the private key)." (Piethein Strengholt, "Data Management at Scale", 2020)
"The process of encoding data in such a way to prevent unauthorized access." (AICPA)
15 March 2017
⛏️Data Management: Data Conversion (Definitions)
"The function to translate data from one format to another" (Yang Xiang & Daxin Tian, "Multi-Core Supported Deep Packet Inspection", 2010)
"1.In systems, the migration from the use of one application to another. 2.In data management, the process of preparing, reengineering, cleansing and transforming data and loading it into a new target data structure. Typically, the term is used to describe a one-time event as part of a new database implementation. However, it is sometimes used to describe an ongoing operational procedure." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
"(1)The process of changing data structure, format, or contents to comply with some rule or measurement requirement. (2)The process of changing data contents stored in one system so that it can be stored in another system, or used by an application." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
"The process of automatically reading data in one file format and emitting the same data in a different format, thus making the data accessible to a wider range of applications." (Open Data Handbook)
"To change data from one form of representation to another; for example, to convert data from an ASCII representation to an EBCDIC representation." (IEEE 610.5-1990)
⛏️Data Management: Data Compression (Definitions)
"any kind of data reduction method that preserves the application-specific information." (Teuvo Kohonen, "Self-Organizing Maps 3rd Ed.", 2001)
"The process of reducing the size of data by use of mathematical algorithms." (Tom Petrocelli, "Data Protection and Information Lifecycle Management", 2005)
"1.Algorithms or techniques that change data to a smaller physical size that contains the same information. 2.The process of changing data to be stored in a smaller physical or logical space." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
"Encoding information in such a way that its representation consumes less space in memory" (Hasso Plattner, "A Course in In-Memory Data Management: The Inner Mechanics of In-Memory Databases 2nd Ed.", 2014)
"Compression is a data management technique that uses repeating patterns in data to reduce the storage needed to hold the data. A compression algorithm for databases should perform compression and decompression operations as fast as possible. This often entails a trade-off between the speed of compression/decompression and the size of the compressed data. Faster compression algorithms can lead to larger compressed data than other, slower algorithms." (Dan Sullivan, "NoSQL for Mere Mortals®", 2015)
"Reducing the amount of space needed to store a piece of data" (Nell Dale & John Lewis, "Computer Science Illuminated" 6th Ed., 2015)
"The process of reducing the size of a data file by encoding information using fewer bits than the original file." (Faithe Wempen, "Computing Fundamentals: Introduction to Computers", 2015)
"A method that reduces the amount of space needed for storing data. See also client compression and hardware compression." (CommVault, "Documentation 11.20", 2018)
"Any technique used to reduce the amount of storage required to store data." (IEEE 610.5-1990)
14 March 2017
⛏️Data Management: Data Protection (Definitions)
"The protecting of data from damage, destruction, and unauthorized alteration." (Tom Petrocelli, "Data Protection and Information Lifecycle Management", 2005)
"Deals with issues such as data security, privacy, and availability. Data protection controls are required by regulations and industry mandates such as Sarbanes-Oxley, European Data Protection Law, and others." (Allen Dreibelbis et al, "Enterprise Master Data Management", 2008)
"A set of rules that aim to protect the rights, freedoms and interests of individuals when information related to them is being processed." (Maria Tzanou, "Data Protection in EU Law after Lisbon: Challenges, Developments, and Limitations", 2015)
"An umbrella term for various procedures that ensure information is secure and available only to authorized users." (Peter Sasvari & Zoltán Nagymate, "The Empirical Analysis of Cloud Computing Services among the Hungarian Enterprises", 2015)
"Protection of the data against unauthorized access by third parties as well as protection of personal data (such as customer data) in the processing of data according to the applicable legal provisions." (Boris Otto & Hubert Österle, "Corporate Data Quality", 2015)
"Legal control over access to, and use of, data in computers." (Lucy Self & Petros Chamakiotis, "Understanding Cloud Computing in a Higher Education Context", 2018)
"Data protection is a task of safeguarding personal or sensitive data which are complex and widely distributed." (M Fevzi Esen & Eda Kocabas, "Personal Data Privacy and Protection in the Meeting, Incentive, Convention, and Exhibition (MICE) Industry", 2019)
"Process of protecting important information from corruption, compromise, or loss." (Patrícia C T Gonçalves, "Medical Social Networks, Epidemiology and Health Systems", 2021)
"The process involving use of laws to protect data of individuals from unauthorized disclosure or access." (Frank Makoza, "Learning From Abroad on SIM Card Registration Policy: The Case of Malawi", 2019)
"Is the process in information and communication technology that deals with the ability an organization or individual to safeguard data and information from corruption, theft, compromise, or loss." (Valerianus Hashiyana et al, "Integrated Big Data E-Healthcare Solutions to a Fragmented Health Information System in Namibia", 2021)
"The mechanisms with which an organization enables individuals to retain control of the personal data they willingly share, where security provides policies, controls, protocols, and technologies necessary to fulfill rules and obligations in accordance with privacy regulations, industry standards, and the organization's ethics and social responsibility." (Forrester)
06 March 2017
⛏️Data Management: Audit Trail (Definitions)
"Audit records stored in the sybsecurity database." (Karen Paulsell et al, "Sybase SQL Server: Performance and Tuning Guide", 1996)
"A record of what happened to data from its inception to its current state. Audit trails help verify the integrity of data." (Microsoft Corporation, "Microsoft SQL Server 7.0 Data Warehouse Training Kit", 2000)
"Data maintained to trace activity, such as a transaction log, for purposes of recovery or audit." (Craig S Mullins, "Database Administration", 2012)
"A chronological record of activities on information resources that enables the reconstruction and examination of sequences of activities on those information resources for later review." (Mark Rhodes-Ousley, "Information Security: The Complete Reference, Second Edition" 2nd Ed., 2013)
"A trace of a sequence of events in a clerical or computer system. This audit usually identifies the creation or modification of any element in the system, who did it, and (possibly) why it was done." (Marcia Kaufman et al, "Big Data For Dummies", 2013)
"A chronological record of events or transactions. An audit trail is used for examining or reconstructing a sequence of events or transactions, managing security, and recovering lost transactions." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)
"A path by which the original input to a process (e.g. data) can be traced back through the process, taking the process output as a starting point. This facilitates result checking and allows a process audit to be carried out [after TMap]." (Software Quality Assurance)
05 March 2017
⛏️Data Management: System of Record (Definitions)
"The system that definitively specifies data values. In dealing with redundant data, you can have values that should be the same but disagree. The system of record is the system you go back to, in order to verify the true value of the data." (Microsoft Corporation, "Microsoft SQL Server 7.0 Data Warehouse Training Kit", 2000)
"The definitive and singular source of operational data. If data element ABC has a value of 25 in a database record but a value of 45 in the system of record, by definition, the first value is incorrect and must be reconciled. The system of record is useful for managing redundancy of data." (William H Inmon, "Building the Data Warehouse", 2005)
"The single authoritative, enterprise-designated source of operational data. It is the most current, accurate source of its data." (David Lyle & John G Schmidt, "Lean Integration", 2010)
"A system that stores the 'official' version of a data attribute." (DAMA International, "The DAMA Dictionary of Data Management", 2011)
"The system of record is a system that is charged with keeping the most complete or trustworthy representation of a set of entities. Within the practice of master data management, such representations are referred to as golden records and the system of record can also be called the system of truth." (Laura Sebastian-Coleman, "Measuring Data Quality for Ongoing Improvement ", 2012)
"Records from which information is retrieved by the name, identifying number, symbol, or other identifying particular assigned to the individual. Sometimes abbreviated as SOR." ( Manish Agrawal, "Information Security and IT Risk Management", 2014)
"An information storage system (commonly implemented on a computer system) that is the authoritative data source for a given data element or piece of information. The need to identify systems of record can become acute in organizations where management information systems have been built by taking output data from multiple-source systems, reprocessing this data, and then re-presenting the result for a new business use." (Janice M Roehl-Anderson, "IT Best Practices for Financial Managers", 2010)
About Me
- Adrian
- Koeln, NRW, Germany
- IT Professional with more than 24 years' experience across the full life cycle of Web/Desktop/Database Applications Development, Software Engineering, Consultancy, Data Management, Data Quality, Data Migrations, Reporting, ERP implementations & support, Team/Project/IT Management, etc.