Ontologies are platforms that enable the sharing and reuse of knowledge by establishing common vocabularies and semantic interpretations of terms. While ontologies may provide for reusability, shareability, or both, the evaluation of their definitions and of their software environment is critical to the success of the final applications that reuse and share those definitions. If wrong definitions from the ontology coexist with specific knowledge formalized in the knowledge base (KB), the knowledge-based system (KBS) may draw poor or wrong conclusions. The lack of methods for evaluating ontologies in laboratories can be an obstacle to their use in companies. This paper presents a set of emerging ideas on ontology evaluation that are useful to: ontology developers in the lab, as a foundation from which to perform technical evaluations; end users of ontologies in companies, as a point of departure in the search for the best ontology for their systems; and future research, as a basis for progressive and disciplined investigation in this area. After briefly exploring some general questions (why, what, when, how, and where to evaluate; who evaluates; and what to evaluate against), we focus on defining a set of criteria useful in the evaluation process. Finally, we apply some of these criteria to the evaluation of the Bibliographic-Data ontology (T. Gruber, 1994).
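As a concrete illustration of the kind of technical evaluation the abstract refers to, the sketch below (not taken from the paper) checks one classic definitional error: circularity, where a class ends up as a subclass of itself through a chain of subclass-of links. The toy taxonomy, the single-parent dict representation, and the function name are all illustrative assumptions.

```python
# Illustrative sketch only: one technical check an ontology evaluator might run.
# Assumes a simplified single-parent taxonomy encoded as {class: superclass};
# real ontologies allow multiple superclasses and richer axioms.

def classes_in_cycle(parents):
    """Return the set of classes that lie on a circular subclass-of chain."""
    circular = set()
    for cls in parents:
        seen = set()
        node = parents.get(cls)
        # Follow superclass links until we fall off the taxonomy or repeat.
        while node is not None and node not in seen:
            if node == cls:          # chain returned to its start: circularity
                circular.add(cls)
                break
            seen.add(node)
            node = parents.get(node)
    return circular

# Toy bibliographic taxonomy with one deliberate circularity error.
taxonomy = {
    "Book": "Publication",
    "Article": "Publication",
    "Publication": "Document",
    "Thesis": "Report",      # circular with the next entry
    "Report": "Thesis",
}
print(sorted(classes_in_cycle(taxonomy)))   # ['Report', 'Thesis']
```

A check like this would be one small piece of a developer's lab evaluation; the paper's criteria cover a much broader range of definitional and environmental properties.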