Formal frameworks exist that allow service providers and users to negotiate the quality of a service. While these agreements usually cover non-functional service properties, the quality of the information offered by a provider is typically neglected. Yet, in important application scenarios, notably those based on the service-oriented computing paradigm, the outcome of complex workflows is directly affected by the quality of the data involved. In this paper, we propose a model for formal data quality agreements between data providers and data consumers, and analyze its feasibility by showing how a provider may take data quality constraints into account as part of its data provisioning process. Our analysis of the technical issues involved suggests that this is a complex problem in general, although satisfactory algorithmic and architectural solutions can be found under certain assumptions. To support this claim, we describe an algorithm for enforcing constraints on the completeness of a query result with respect to a reference data source, and outline an initial provider architecture for managing more general data quality constraints.
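As an illustration of the kind of constraint the abstract refers to, a common way to quantify completeness relative to a reference source is the fraction of the reference's entities that appear in the query result. The sketch below is an assumed, minimal definition for illustration; it is not the paper's algorithm, and the function names and key-based matching are hypothetical.

```python
def completeness(result_keys, reference_keys):
    """Relative completeness: fraction of the reference source's
    entities (identified by keys) that are present in the result.
    Illustrative measure only; entity matching is assumed exact."""
    reference = set(reference_keys)
    if not reference:
        return 1.0  # an empty reference is trivially covered
    return len(set(result_keys) & reference) / len(reference)


def satisfies_constraint(result_keys, reference_keys, threshold):
    """A data quality agreement might require that a query result
    reach at least a given completeness threshold."""
    return completeness(result_keys, reference_keys) >= threshold
```

Under this view, a provider checking a completeness clause of an agreement reduces to comparing the measured value against the negotiated threshold before (or while) delivering the result.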