We present a novel system for corpus-based terminological evaluation of ontologies. Starting from the assumption that a domain of interest can be represented by a corpus of text documents, we first extract a list of domain-specific key-concepts from the corpus and rank them by relevance; we then apply various evaluation metrics to assess the terminological coverage of a domain ontology with respect to this list of key-concepts.

Among the advantages of the proposed approach, the framework is highly automatable, requiring little human intervention. It is made available online through a collaborative wiki-based system that can be accessed by different users, from domain experts to knowledge engineers.

A comprehensive experimental analysis of our approach shows that the proposed ontology metrics allow the terminological coverage of an ontology to be assessed with respect to a given domain, and that our framework can be effectively applied to many evaluation-related scenarios.
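The core idea — measuring how well an ontology's vocabulary covers a ranked list of corpus key-concepts — can be sketched as follows. This is a minimal illustration, assuming a simple exact-string-matching strategy and hypothetical function and variable names; the actual system uses richer evaluation metrics than this single coverage ratio.

```python
def terminological_coverage(key_concepts, ontology_labels, top_k=None):
    """Fraction of the (optionally top-k) ranked corpus key-concepts
    that appear among the ontology's concept labels.

    key_concepts:    list of strings, ordered by decreasing relevance.
    ontology_labels: labels of the concepts in the domain ontology.
    """
    # Case-insensitive matching is an illustrative simplification;
    # a real system would normalize terms more carefully.
    labels = {label.lower() for label in ontology_labels}
    concepts = key_concepts[:top_k] if top_k else key_concepts
    matched = [c for c in concepts if c.lower() in labels]
    return len(matched) / len(concepts) if concepts else 0.0

# Toy example: key-concepts ranked by relevance vs. a small ontology.
ranked = ["ontology", "keyphrase extraction", "semantic web", "wiki"]
labels = ["Ontology", "Semantic Web", "Knowledge Base"]
print(terminological_coverage(ranked, labels))           # 0.5
print(terminological_coverage(ranked, labels, top_k=2))  # 0.5
```

Restricting the computation to the top-k key-concepts lets the metric weight the most relevant domain terms, which matters when the extracted list is long and its tail is noisy.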