Database Cleaning Task
A Database Cleaning Task is a data processing task that requires the detection and correction (or removal) of erroneous data items.
- AKA: Data Scrubbing/Cleansing.
- Context:
- Input: Data Records.
- Output: Cleaned Data Records.
- Performance Metric: Data Quality.
- It can be solved by a Data Cleaning System (that implements a data cleaning algorithm).
- It can support a Data Transformation capability.
- It can involve filtering, merging, decoding, and translating source data to create validated data.
- It can be supported by a Record Resolution Task and a Record Deduplication Task.
- …
- Example(s):
- Person Age = 207 years: the detection of an inaccurate data value (see the sketch after this outline).
- …
- Counter-Example(s):
- See: Source Data, Data Normalization, Information Integration.
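The following is a minimal sketch of such a task in Python, under simple assumptions: the PersonRecord type, the age-range rule, and the data-quality ratio are illustrative, not part of any standard library. It takes data records as input, returns the cleaned records alongside the erroneous ones, and reports data quality as the fraction of records that pass all rules.

```python
from dataclasses import dataclass

# Hypothetical record type; the fields and the age-range rule are illustrative assumptions.
@dataclass
class PersonRecord:
    name: str
    age: int

def clean_records(records, rules):
    """Detect erroneous records and return (cleaned records, erroneous records)."""
    cleaned, erroneous = [], []
    for record in records:
        (cleaned if all(rule(record) for rule in rules) else erroneous).append(record)
    return cleaned, erroneous

# Validity rule: a person's age should fall in a plausible range.
rules = [lambda r: 0 <= r.age <= 130]

records = [PersonRecord("Alice", 34), PersonRecord("Bob", 207)]
cleaned, erroneous = clean_records(records, rules)

# Performance metric: data quality as the fraction of records that pass all rules.
print(len(cleaned) / len(records))   # 0.5
print(erroneous)                     # [PersonRecord(name='Bob', age=207)]
```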
References
2017
- https://hbr.org/2017/02/how-chief-data-officers-can-get-their-companies-to-collect-clean-data
- QUOTE: In analytics, nothing matters more than data quality. The practical way to control data quality is to do it at the point where the data is created. Cleaning up data downstream is expensive and not scalable, because data is a byproduct of business processes and operations like marketing, sales, plant operations, and so on. But controlling data quality at the point of creation requires a change in the behaviors of those creating the data and the IT tools they use.
2014
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Data_cleansing Retrieved:2014-8-3.
- Data cleansing, data cleaning or data scrubbing is the process of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table, or database. Used mainly in databases, the term refers to identifying incomplete, incorrect, inaccurate, irrelevant, etc. parts of the data and then replacing, modifying, or deleting this dirty data or coarse data.
After cleansing, a data set will be consistent with other similar data sets in the system. The inconsistencies detected or removed may have been originally caused by user entry errors, by corruption in transmission or storage, or by different data dictionary definitions of similar entities in different stores.
Data cleansing differs from data validation in that validation almost invariably means data is rejected from the system at entry and is performed at entry time, rather than on batches of data.
The actual process of data cleansing may involve removing typographical errors or validating and correcting values against a known list of entities. The validation may be strict (such as rejecting any address that does not have a valid postal code) or fuzzy (such as correcting records that partially match existing, known records).
Some data cleansing solutions will clean data by cross checking with a validated data set. Also data enhancement, where data is made more complete by adding related information, is a common data cleansing practice.
For example, appending addresses with phone numbers related to that address.
Data cleansing may also involve activities like harmonization of data and standardization of data. For example, harmonization of short codes (St, Rd, etc.) to actual words (Street, Road). Standardization of data is a means of changing a reference data set to a new standard, e.g., use of standard codes.
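A minimal Python sketch of the strict validation, fuzzy correction, and short-code harmonization described in the quoted passage above; the reference postal codes, city list, expansion table, and similarity cutoff are illustrative assumptions.

```python
import difflib
import re

# Illustrative reference data; these values are assumptions, not a real validated data set.
VALID_POSTAL_CODES = {"10001", "94105", "60601"}
SHORT_CODES = {"st": "street", "rd": "road", "ave": "avenue"}

def strict_validate(postal_code: str) -> bool:
    """Strict validation: reject any record whose postal code is not on the known list."""
    return postal_code in VALID_POSTAL_CODES

def fuzzy_correct(city: str, known_cities: list[str], cutoff: float = 0.8) -> str:
    """Fuzzy validation: correct a value that partially matches an existing, known record."""
    matches = difflib.get_close_matches(city, known_cities, n=1, cutoff=cutoff)
    return matches[0] if matches else city

def harmonize(address: str) -> str:
    """Harmonization: expand short codes (St, Rd, ...) to actual words (street, road, ...)."""
    def expand(match: re.Match) -> str:
        word = match.group(0)
        return SHORT_CODES.get(word.lower(), word)
    return re.sub(r"\b\w+\b", expand, address)

print(strict_validate("94105"))                                       # True
print(fuzzy_correct("San Fransisco", ["San Francisco", "San Jose"]))  # "San Francisco"
print(harmonize("42 Main St"))                                        # "42 Main street"
```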
2000
- (Sarawagi, 2000) ⇒ Sunita Sarawagi, editor. (2000). “IEEE Data Engineering Special Issue on Data Cleaning.” http://www.research.microsoft.com/research/db/debull/A00dec/issue.htm
1999
- (Zaiane, 1999) ⇒ Osmar Zaiane. (1999). “Glossary of Data Mining Terms.” University of Alberta, Computing Science CMPUT-690: Principles of Knowledge Discovery in Databases.
- QUOTE: Data Cleansing: Also Data Cleaning. The process of ensuring that all values in a dataset are consistent and correctly recorded by removing redundancies and inconsistencies in data.
1998
- (Kohavi & Provost, 1998) ⇒ Ron Kohavi, and Foster Provost. (1998). “Glossary of Terms.” In: Machine Learning, 30(2-3).
- Data cleaning/cleansing: The process of improving the quality of the data by modifying its form or content, for example by removing or correcting data values that are incorrect. This step usually precedes the machine learning step, although the knowledge discovery process may indicate that further cleaning is desired and may suggest ways to improve the quality of the data. For example, learning that the pattern Wife implies Female from the census sample at UCI has a few exceptions may indicate a quality problem.
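A minimal sketch of the exception check Kohavi & Provost mention, assuming a pandas DataFrame with relationship and sex columns modeled loosely on the UCI census (Adult) sample; the toy rows are fabricated for illustration, and records violating the expected pattern "Wife implies Female" are flagged as candidate quality problems.

```python
import pandas as pd  # assumed available; column names mimic the UCI census (Adult) data set

# Toy sample for illustration; the third row is a likely data-quality problem.
census = pd.DataFrame({
    "relationship": ["Wife", "Husband", "Wife", "Wife"],
    "sex":          ["Female", "Male", "Male", "Female"],
})

# Records violating the expected pattern "Wife implies Female" flag candidate errors
# that may warrant further cleaning.
exceptions = census[(census["relationship"] == "Wife") & (census["sex"] != "Female")]
print(exceptions)
```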