Producing reliable information is the ultimate goal of data processing. The ocean of data created by advances in science and technology calls for integrating data from heterogeneous sources that differ in purpose, business rules, underlying models, and enabling technologies. Reference models, the Semantic Web, standards, ontologies, and related technologies enable fast and efficient merging of heterogeneous data, but the reliability of the resulting information is largely determined by how well the data represent reality. In this paper, we propose a framework for assessing the informational value of data that covers data dimensions; alignment of data quality with business practices; identification of authoritative sources and integration keys; model merging; and reconciliation of data sets that are updated at varying frequencies, overlap, or contain gaps.
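Two of the framework's concerns, designating an authoritative source and merging overlapping or gapped records on a shared integration key, can be illustrated with a minimal sketch. All names here (a patient registry as the authoritative source, a lab feed as the secondary source, `patient_id` as the integration key) are hypothetical assumptions for illustration, not part of the paper's framework:

```python
# Hypothetical records from two heterogeneous sources, keyed by an
# integration key (a patient identifier). The registry is treated as
# the authoritative source for demographics; the lab feed fills gaps
# and contributes records the registry lacks.
registry = {
    "p1": {"name": "Ann Lee", "dob": "1970-02-01", "updated": "2012-05-01"},
}
lab_feed = {
    "p1": {"name": "A. Lee", "glucose": 5.4, "updated": "2012-06-15"},
    "p2": {"name": "Bo Kim", "glucose": 6.1, "updated": "2012-06-15"},
}

def merge(authoritative, secondary):
    """Merge two source dictionaries on the integration key.

    Fields from the authoritative source win on conflict; the secondary
    source fills gaps and supplies keys the authoritative source lacks.
    The merged record keeps the most recent update timestamp, so feeds
    refreshed at different frequencies reconcile to the latest state.
    """
    merged = {}
    for key in set(authoritative) | set(secondary):
        a = authoritative.get(key, {})
        b = secondary.get(key, {})
        record = {**b, **a}  # authoritative fields override on conflict
        # ISO-8601 date strings compare chronologically as strings
        stamps = [r["updated"] for r in (a, b) if "updated" in r]
        record["updated"] = max(stamps)
        merged[key] = record
    return merged

result = merge(registry, lab_feed)
```

Here `result["p1"]` takes its name from the authoritative registry while retaining the glucose value and the newer timestamp from the lab feed, and `"p2"` survives intact even though the registry has a gap for it; real integrations would add schema mapping and conflict auditing on top of such a skeleton.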