Publication details
Guidelines of Data Quality Issues for Data Integration in the Context of the TPC-DI Benchmark
| | |
|---|---|
| Authors | |
| Year of publication | 2017 |
| Type | Article in Proceedings |
| Conference | Proceedings of the 19th International Conference on Enterprise Information Systems |
| MU Faculty or unit | |
| Citation | |
| Web | Springer, indexed by SCOPUS, WoS, DBLP |
| DOI | http://dx.doi.org/10.5220/0006334301350144 |
| Field | Informatics |
| Keywords | Data integration; Data quality; ETL; TPC-DI benchmark |
| Description | Nowadays, many business intelligence and master data management initiatives rely on regular data integration. Because data integration extracts and combines a variety of data sources, it is considered a prerequisite for data analytics and management. More recently, TPC-DI has been proposed as an industry benchmark for data integration; it is designed to benchmark data integration and to serve as a standardisation for evaluating ETL performance. Source data exhibit a variety of data quality problems, such as multi-meaning attributes and inconsistent data schemas, which not only cause problems for the data integration process but also affect subsequent data mining and data analytics. This paper summarises typical data quality problems in data integration and adapts the traditional data quality dimensions to classify them. We found that data completeness, timeliness and consistency are critical for data quality management in data integration, and that data consistency should be further defined at the pragmatic level. In order to prevent typical data quality problems and proactively manage data quality in ETL, we propose a set of practical guidelines for researchers and practitioners to conduct data quality management in data integration. |
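
The guidelines themselves are presented in the paper; purely as a loose illustration of two of the quality dimensions the abstract highlights (completeness and consistency), the sketch below checks a staged feed before loading. The feed, field names, and data are hypothetical and are not taken from the paper or from the TPC-DI schema.

```python
# Illustrative sketch (not from the paper): simple completeness and
# referential-consistency checks on staged records in an ETL step.
from datetime import date


def completeness(records, required_fields):
    """Fraction of records with a non-empty value for every required field."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return complete / len(records)


def referential_consistency(fact_rows, dim_keys, fk_field):
    """Return fact rows whose foreign key has no match in the dimension."""
    return [r for r in fact_rows if r.get(fk_field) not in dim_keys]


if __name__ == "__main__":
    # Hypothetical staged data, invented for this example.
    customers = [
        {"id": 1, "name": "Ada", "dob": date(1980, 5, 1)},
        {"id": 2, "name": "", "dob": None},   # incomplete record
    ]
    trades = [
        {"trade_id": 10, "customer_id": 1},
        {"trade_id": 11, "customer_id": 99},  # dangling foreign key
    ]

    print("completeness:", completeness(customers, ["name", "dob"]))
    print("orphan trades:", referential_consistency(
        trades, {c["id"] for c in customers}, "customer_id"))
```

Checks like these would typically run against staging tables before the load phase, so that quality problems are caught proactively rather than propagated into the warehouse.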