Data Quality Measure


A Data Quality Measure is a quality measure of the degree to which a database (or database record) conforms to a Data Specification.
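As an illustration, the following is a minimal sketch (in Python) of one such measure: it scores a single record against a hypothetical data specification expressed as per-field validity checks. The field names, the spec format, and the simple fraction-of-valid-fields scoring rule are assumptions made for this example, not definitions taken from the references below.

```python
# Minimal sketch of a data quality measure: the fraction of specified
# fields in a record that are present and valid. The spec format and
# field names are illustrative assumptions, not part of any standard.
from typing import Any, Callable, Dict

# A data specification maps each required field to a validity check.
DataSpecification = Dict[str, Callable[[Any], bool]]

def data_quality_measure(record: Dict[str, Any], spec: DataSpecification) -> float:
    """Return the proportion of specified fields that are present and valid."""
    if not spec:
        return 1.0  # an empty specification is trivially satisfied
    valid = 0
    for field, is_valid in spec.items():
        value = record.get(field)
        if value is not None and is_valid(value):
            valid += 1
    return valid / len(spec)

# Example usage with a made-up customer-record specification.
customer_spec: DataSpecification = {
    "customer_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "signup_date": lambda v: isinstance(v, str) and len(v) == 10,  # e.g. "2020-06-03"
}

record = {"customer_id": 42, "email": "not-an-email", "signup_date": "2020-06-03"}
print(data_quality_measure(record, customer_spec))  # 0.666... (the email check fails)
```

A richer measure might weight fields by importance or report per-dimension scores (completeness, validity, and so on) rather than a single ratio.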



References


2020

  • (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/Data_quality#Definitions Retrieved:2020-6-3.
    • … those expectations, specifications, and requirements are stated in terms of characteristics or dimensions of the data, such as:
      • accessibility or availability
      • accuracy or correctness
      • comparability
      • completeness or comprehensiveness
      • consistency, coherence, or clarity
      • credibility, reliability, or reputation
      • relevance, pertinence, or usefulness
      • timeliness or latency
      • uniqueness
      • validity or reasonableness
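As a rough illustration of how a few of the dimensions listed above can be turned into numeric measures, the sketch below scores completeness, uniqueness, and validity as simple ratios in [0, 1] over a toy list-of-dicts table. The column names, example rows, and validity rule are made-up assumptions for illustration only.

```python
# Sketch of dataset-level scores for three quality dimensions:
# completeness, uniqueness, and validity, each as a ratio in [0, 1].
from typing import Any, Callable, Dict, List

Row = Dict[str, Any]

def completeness(rows: List[Row], columns: List[str]) -> float:
    """Fraction of (row, column) cells that are populated."""
    cells = [row.get(c) for row in rows for c in columns]
    return sum(v is not None for v in cells) / len(cells) if cells else 1.0

def uniqueness(rows: List[Row], key: str) -> float:
    """Fraction of rows whose key value is distinct."""
    values = [row.get(key) for row in rows]
    return len(set(values)) / len(values) if values else 1.0

def validity(rows: List[Row], column: str, rule: Callable[[Any], bool]) -> float:
    """Fraction of rows whose value in `column` passes the rule."""
    return sum(rule(row.get(column)) for row in rows) / len(rows) if rows else 1.0

rows = [
    {"id": 1, "age": 34},
    {"id": 2, "age": None},
    {"id": 2, "age": 250},  # duplicate id, implausible age
]
print(completeness(rows, ["id", "age"]))  # 0.833... (5 of 6 cells populated)
print(uniqueness(rows, "id"))             # 0.666... (2 distinct ids in 3 rows)
print(validity(rows, "age", lambda v: isinstance(v, int) and 0 <= v <= 120))  # 0.333...
```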

2018

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/data_quality Retrieved:2018-11-28.
    • Data quality refers to the condition of a set of values of qualitative or quantitative variables. There are many definitions of data quality but data is generally considered high quality if it is "fit for [its] intended uses in operations, decision making and planning". Alternatively, data is deemed of high quality if it correctly represents the real-world construct to which it refers. Furthermore, apart from these definitions, as data volume increases, the question of internal data consistency becomes significant, regardless of fitness for use for any particular external purpose. People's views on data quality can often be in disagreement, even when discussing the same set of data used for the same purpose. Data cleansing may be required in order to ensure data quality.


2009

  • https://talend.com/resources/what-is-data-quality/
    • QUOTE: ... Data quality is the process of conditioning data to meet the specific needs of business users. Data quality initiatives are generally centered on improving data quality metrics so that data will promote optimal performance of business systems and support user faith in the systems' reliability.

      Data has quality if it satisfies the requirements of intended use.

      The narrow definition of data quality is that it's about data that is missing or incorrect. A broader definition is that data quality is achieved when a business uses data that is comprehensive, consistent, relevant and timely. …


2003

  • IMF Statistics Department. (2003). "Data Quality Assessment Framework and Data Quality Program." In:
    • QUOTE: ... The DQAF provides a structure for assessing data quality by comparing country statistical practices with best practices, including internationally accepted methodologies. Rooted in the United Nations Fundamental Principles of Official Statistics, it is the product of an intensive consultation with national and international statistical authorities and data users inside and outside the Fund. ...
      1. Assurances of integrity: The principle of objectivity in the collection, processing, and dissemination of statistics is firmly adhered to. ...
      2. Methodological soundness: The methodological basis for the statistics follows internationally accepted standards, guidelines, or good practices. ...
      3. Accuracy and reliability: Source data and statistical techniques are sound and statistical outputs sufficiently portray reality. ...
      4. Serviceability: Statistics, with adequate periodicity and timeliness, are consistent and follow a predictable revisions policy. ...
      5. Accessibility: Data and metadata are easily available and assistance to users is adequate. …