Data cleansing (or data scrubbing) is the act of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table, or database.
Used mainly in databases, the term refers to identifying incomplete, incorrect, inaccurate, or irrelevant parts of the data and then replacing, modifying, or deleting this dirty data.
The goal is to get data ready for analysis.
After cleansing, a data set will be consistent with other similar data sets in the system. The inconsistencies detected or removed may originally have been caused by differing data dictionary definitions of similar entities in different stores, by user entry errors, or by corruption in transmission or storage.
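As a concrete illustration of the replace/modify/delete steps above, here is a minimal rule-based sketch using pandas (the columns mirror the state/city/rent example below; the values are made up for illustration):

```python
import pandas as pd

# A small record set with typical "dirty data" problems: inconsistent
# casing, a missing value, a non-numeric rent, and a duplicate row.
df = pd.DataFrame({
    "state": ["NY", "ny", "CA", "CA", None],
    "city":  ["New York", "new york", "Los Angeles", "Los Angeles", "Boston"],
    "rent":  ["2500", "2500", "abc", "1800", "1900"],
})

# Modify: normalize inconsistent representations of the same entity.
df["state"] = df["state"].str.upper()
df["city"] = df["city"].str.title()

# Replace: coerce invalid rents to NaN so they can be handled uniformly.
df["rent"] = pd.to_numeric(df["rent"], errors="coerce")

# Delete: drop duplicates and records that remain incomplete.
df = df.drop_duplicates().dropna(subset=["state", "rent"])

print(df)
```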
Probabilistic scripts for automating common-sense tasks: the idea is to clean a data set (state, city, and rent) automatically with a Bayesian probabilistic script that encodes prior domain knowledge declaratively, such as which cities belong to which states and what range of rents is plausible for each city.
The process is: write a generative model that encodes the prior knowledge, condition it on the observed (possibly corrupted) record, and infer the most probable clean values, as sketched below.
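Gen itself is a Julia library; the following is a language-agnostic sketch of the same idea in plain Python, with hypothetical domain knowledge (the city tables and rent ranges are invented for illustration) and brute-force enumeration standing in for Gen's inference machinery:

```python
# Hypothetical prior domain knowledge (illustrative values, not from the
# source): which cities belong to which states, and a plausible rent
# range per city.
CITIES = {"NY": ["New York", "Buffalo"], "CA": ["Los Angeles", "San Diego"]}
RENT_RANGE = {"New York": (2000, 4000), "Buffalo": (700, 1500),
              "Los Angeles": (1800, 3500), "San Diego": (1500, 3000)}

def prior(state, city, rent):
    """Unnormalized prior over clean records: 1 if the city belongs to
    the state and the rent falls in the city's plausible range, else 0."""
    if city not in CITIES.get(state, []):
        return 0.0
    lo, hi = RENT_RANGE[city]
    return 1.0 if lo <= rent <= hi else 0.0

def likelihood(observed, clean):
    """Simple noise model: each field of the clean record is transcribed
    correctly with probability 0.95 and corrupted otherwise."""
    p = 1.0
    for o, c in zip(observed, clean):
        p *= 0.95 if o == c else 0.05
    return p

def clean_record(observed, rents=range(500, 4001, 100)):
    """Bayesian cleaning by enumeration: score every candidate clean
    record by prior * likelihood and return the MAP estimate."""
    candidates = [(s, c, r) for s in CITIES for c in CITIES[s] for r in rents]
    return max(candidates,
               key=lambda cand: prior(*cand) * likelihood(observed, cand))

# A dirty record: the state "CA" contradicts the city and the rent, both
# of which point to New York, so the script corrects the state.
print(clean_record(("CA", "New York", 3800)))  # -> ('NY', 'New York', 3800)
```

Gen expresses the same structure with generative functions and reusable inference algorithms, so the model scales beyond what exhaustive enumeration can handle.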
More: https://probcomp.github.io/Gen/