Data Quality Measure
A Data Quality Measure is a quality measure for a database (or database record) relative to a Data Specification.
- Context:
- It can range from being a Quantitative Data Quality Measure to being a Qualitative Data Quality Measure.
- It can be measured by a Data Quality Assessment System (that supports a data quality analysis task).
- It can be defined by a Data Governance System.
- It can be a Data Validity Measure.
- It can be a Data Auditability Measure.
- It can be a Data Availability Measure.
- It can be a Data Accessibility Measure.
- It can be a Data Consistency Measure.
- It can be a Data Reusability Measure.
- It can be a Data Security Measure.
- It can be a Data Timeliness Measure.
- It can be a Data Understandability Measure.
- It can be supported by a Data Verification Rule.
- It can be assessed by a Data Quality Monitoring System (against data quality specifications).
- …
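A quantitative data quality measure of the kind described above can be sketched as the fraction of records in a table that satisfy every data verification rule in a data specification. The function and rule names below are hypothetical, chosen only for illustration:

```python
# Minimal sketch (hypothetical names): a quantitative data quality
# measure computed as the fraction of records that satisfy all
# verification rules drawn from a data specification.

def data_quality_measure(records, rules):
    """Return the fraction of records passing every verification rule."""
    if not records:
        return 1.0  # vacuously satisfied for an empty table
    passing = sum(1 for r in records if all(rule(r) for rule in rules))
    return passing / len(records)

# Hypothetical specification: age must be a non-negative integer,
# and email must contain an "@" character.
rules = [
    lambda r: isinstance(r.get("age"), int) and r["age"] >= 0,
    lambda r: "@" in str(r.get("email", "")),
]

records = [
    {"age": 34, "email": "a@example.com"},
    {"age": -1, "email": "b@example.com"},   # fails the age rule
    {"age": 28, "email": "not-an-email"},    # fails the email rule
]

print(data_quality_measure(records, rules))  # 1 of 3 records pass
```

A qualitative data quality measure, by contrast, would typically be an ordinal or narrative assessment (e.g. a reviewer's rating) rather than a computed ratio like this one.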
- Example(s):
- Counter-Example(s):
References
2022
- (Moses et al., 2022) ⇒ Barr Moses, Lior Gavish, and Molly Vorwerck. (2022). “Data Quality Fundamentals.” O'Reilly Media, Inc.
2020
- (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/Data_quality#Definitions Retrieved:2020-6-3.
- … those expectations, specifications, and requirements are stated in terms of characteristics or dimensions of the data, such as:
- accessibility or availability
- accuracy or correctness
- comparability
- completeness or comprehensiveness
- consistency, coherence, or clarity
- credibility, reliability, or reputation
- relevance, pertinence, or usefulness
- timeliness or latency
- uniqueness
- validity or reasonableness
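Several of the dimensions listed above can be scored directly from a flat table. The sketch below (hypothetical field names and formulas, assuming completeness = fraction of non-missing values and uniqueness = fraction of distinct rows) illustrates how such per-dimension scores might be computed:

```python
# Sketch of two dimension scores over a list-of-dicts table.
# Assumptions: completeness = share of non-None values across all
# fields; uniqueness = share of distinct rows among all rows.

def completeness(records, fields):
    """Fraction of (record, field) cells that hold a non-missing value."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields if r.get(f) is not None)
    return filled / total if total else 1.0

def uniqueness(records):
    """Fraction of rows that are distinct."""
    distinct = {tuple(sorted(r.items())) for r in records}
    return len(distinct) / len(records) if records else 1.0

records = [
    {"id": 1, "country": "DE"},
    {"id": 2, "country": None},   # incomplete value
    {"id": 1, "country": "DE"},   # duplicate row
]

print(completeness(records, ["id", "country"]))  # 5 of 6 cells filled
print(uniqueness(records))                       # 2 distinct of 3 rows
```

Dimensions such as credibility or relevance do not reduce to simple ratios like these and are usually assessed qualitatively.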
2018
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/data_quality Retrieved:2018-11-28.
- Data quality refers to the condition of a set of values of qualitative or quantitative variables. There are many definitions of data quality, but data is generally considered high quality if it is "fit for [its] intended uses in operations, decision making and planning". Alternatively, data is deemed of high quality if it correctly represents the real-world construct to which it refers. Furthermore, apart from these definitions, as data volume increases, the question of internal data consistency becomes significant, regardless of fitness for use for any particular external purpose. People's views on data quality can often be in disagreement, even when discussing the same set of data used for the same purpose. Data cleansing may be required in order to ensure data quality.
2009
- https://talend.com/resources/what-is-data-quality/
- QUOTE: ... Data quality is the process of conditioning data to meet the specific needs of business users. Data quality initiatives are generally centered on improving data quality metrics so that data will promote optimal performance of business systems and support user faith in the systems' reliability.
Data has quality if it satisfies the requirements of intended use.
The narrow definition of data quality is that it's about data that is missing or incorrect. A broader definition is that data quality is achieved when a business uses data that is comprehensive, consistent, relevant and timely. …
2007
- (Herzog et al., 2007) ⇒ Thomas N. Herzog, Fritz J. Scheuren, and William E. Winkler. (2007). “Data Quality and Record Linkage Techniques.” Springer Science & Business Media.
2003
- IMF Statistics Department. (2003). “Data Quality Assessment Framework and Data Quality Program." In:
- QUOTE: ... The DQAF provides a structure for assessing data quality by comparing country statistical practices with best practices, including internationally accepted methodologies. Rooted in the United Nations Fundamental Principles of Official Statistics, it is the product of an intensive consultation with national and international statistical authorities and data users inside and outside the Fund. ...
- 1. Assurances of integrity: The principle of objectivity in the collection, processing, and dissemination of statistics is firmly adhered to. ...
- 2. Methodological soundness: The methodological basis for the statistics follows internationally accepted standards, guidelines, or good practices. ...
- 3. Accuracy and reliability: Source data and statistical techniques are sound and statistical outputs sufficiently portray reality. ...
- 4. Serviceability: Statistics, with adequate periodicity and timeliness, are consistent and follow a predictable revisions policy. ...
- 5. Accessibility: Data and metadata are easily available and assistance to users is adequate. …