"The maximum number of decimal digits that can be stored by numeric and decimal datatypes. The precision includes all digits, both to the right and to the left of the decimal point." (Karen Paulsell et al, "Sybase SQL Server: Performance and Tuning Guide", 1996)
"The maximum total number of decimal digits that can be stored, both to the left and right of the decimal point." (Microsoft Corporation, "SQL Server 7.0 System Administration Training Kit", 1999)
"The degree of detail used to state a numeric quantity; for example, writing a value to two decimal places instead of five decimal places. Contrast with accuracy." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)
"This is the total number of digits that can be stored in an object that uses the decimal datatype." (Joseph L Jorden & Dandy Weyn, "MCTS Microsoft SQL Server 2005: Implementation and Maintenance Study Guide - Exam 70-431", 2006)
"Refers to the preciseness with which a numerical quantity is expressed." (Michael Fitzgerald, "Learning Ruby", 2007)
"In a floating-point number, the number of digits to the right of the decimal point." (Jan L Harrington, "SQL Clearly Explained" 3rd Ed., 2010)
"The maximum number of significant digits that can be represented" (Nell Dale & John Lewis, "Computer Science Illuminated" 6th Ed., 2015)
"An attribute of a number that describes the total number of binary or decimal digits. An attribute of a timestamp that describes the total number of decimal digits in the fractional seconds part of the value." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)