01 July 2009

DBMS: Normalization (Definitions)

"Normalization is the database design process of discarding repeating groups, minimizing redundancy, eliminating composite keys for partial dependency, and separating non-key attributes. Various levels of normalization and various rules or tests have been formalized for performing normalization." (Microsoft Corporation, "Microsoft SQL Server 7.0 Data Warehouse Training Kit", 2000)

"The process of transforming database designs into logical structures by following rules and principles of relational database theory. Different 'normal forms' exist, each further reducing both redundancy and the possibility of update anomalies. 'Third normal form' is a design in which all the attributes of each row 'depend on the key, the whole key, and nothing but the key'." (Bill Pribyl & Steven Feuerstein, "Learning Oracle PL/SQL", 2001)

"The process of designing a database so that its tables follow the rules specified by relational theory. In practice, this usually means that all database tables are in third normal form." (Peter Gulutzan & Trudy Pelzer, "SQL Performance Tuning", 2002)

"Normalization is a method for ensuring that the data model meets the objectives of accuracy, consistency, simplicity, nonredundancy, and stability. It is a physical database design technique that applies mathematical rules to the relational data model to identify and reduce insertion, updating, or deletion anomalies." (Claudia Imhoff et al, "Mastering Data Warehouse Design", 2003)

"A formal approach in data modeling that examines and validates attributes and their entities in the Logical data model. The purpose of data normalization is to ensure that each attribute belongs to the entity to which it has been assigned, that redundant storage of information is minimized, and that storage anomalies are eliminated." (Sharon Allen & Evan Terry, "Beginning Relational Data Modeling 2nd Ed.", 2005)

"Developed by Dr. E. F. Codd in 1970, database normalization is the process of simplifying data and database design to achieve maximum performance and simplicity. This process involves the removing of useless and redundant data." (Thomas Moore, "EXAM CRAM™ 2: Designing and Implementing Databases with SQL Server 2000 Enterprise Edition", 2005)

"A process by which a relational schema design is adjusted to reduce the possibility of storing data redundantly. As a schema is normalized, attributes that contain repeating values are moved into new tables and replaced by a foreign key. This process requires analyzing and understanding the dependencies among attributes and key columns. There are several degrees of normalization, which formally describe the extent to which redundancies have been removed. Third normal form (3NF) is widely accepted as the optimal relational design for a transaction system. A star schema design is often referred to as denormalized, although it is actually in second normal form." (Christopher Adamson, "Mastering Data Warehouse Aggregates", 2006)

"The organization of data to reduce redundancy by creating many linked tables so that a value is stored in only one place." (Reed Jacobsen & Stacia Misner, "Microsoft SQL Server 2005 Analysis Services Step by Step", 2006)

"The process of simplifying the structure of data. Normalization increases granularity and Granularity is the scope of a definition for any particular thing. The more granular a data model is, the easier it becomes to manage, up to a point, depending, of course, on the application of the database model." (Gavin Powell, "Beginning Database Design", 2006)

"A formal process of removing redundancy from a database design by separating it into children tables from the parent table." (Victor Isakov et al, "MCITP Administrator: Microsoft SQL Server 2005 Optimization and Maintenance (70-444) Study Guide", 2007)

"Logical design process in which data is separated into multiple, related tables. The process allows databases to perform optimally." (Sara Morganand & Tobias Thernstrom , "MCITP Self-Paced Training Kit : Designing and Optimizing Data Access by Using Microsoft SQL Server 2005 - Exam 70-442", 2007)

"The design process for generating entity specifications to minimize both data redundancy and update anomalies." (S. Sumathi & S. Esakkirajan, "Fundamentals of Relational Database Management Systems", 2007)

"A series of database design recommendations that dictate how information should be dispersed among tables as well as how these tables should relate." (Robert D. Schneider and Darril Gibson, "Microsoft SQL Server 2008 All-In-One Desk Reference For Dummies", 2008)

"The process of transforming the database's structure to minimize the changes of certain kinds of data anomalies." (Rod Stephens, "Beginning Database Design Solutions", 2008)

"The process of designing relations to adhere to increasingly stringent sets of rules to avoid problems with poor database design." (Jan L Harrington, "Relational Database Design and Implementation" 3rd Ed., 2009)

"The process of breaking up a table into smaller tables to eliminate problems with unwanted loss of data (the egregious side effects of losing data integrity) from the deletion of records and inefficiencies associated with multiple data updates." (Toby J Teorey, ", Database Modeling and Design" 4th Ed., 2010)

"The process, originally articulated by Dr. E. F. Codd in his relational theory, for organizing data to reduce redundancy to the minimum possible. It involves guaranteeing that each attribute in a 'relation' (table or entity class) is truly an attribute of that relation and none other. The process involves organizing data to follow the constraints of at least first normal form, second normal form, and third normal form. Additional value is found in Boyce-Codd normal form, fourth normal form, and fifth normal form." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

"A process that assigns attributes to entities in such a way that data redundancies are reduced or eliminated." (Carlos Coronel et al, "Database Systems: Design, Implementation, and Management 9th Ed", 2011)

"The process of organizing data to minimize redundancy and remove ambiguity. In simple terms, normalization is the process of identifying the one best place each fact belongs." (Craig S Mullins, "Database Administration", 2012)

"The process of organizing data at its detailed level into according to its existence criteria" (Daniel Linstedt & W H Inmon, "Data Architecture: A Primer for the Data Scientist", 2014)

"The process of restructuring a data model by reducing its relations to their simplest forms. It is a key step in the task of building a logical relational database design. Normalization helps avoid redundancies and inconsistencies in data. An entity is normalized if it meets a set of constraints for a particular normal form (first normal form, second normal form, and so on)." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)
