Ontology alignment evaluation initiative: six years of experience

  • Authors: Jérôme Euzenat; Christian Meilicke; Heiner Stuckenschmidt; Pavel Shvaiko; Cássia Trojahn

  • Affiliations: INRIA & LIG, Grenoble, France; University of Mannheim, Germany; University of Mannheim, Germany; Informatica Trentina S.p.A., Trento, Italy; INRIA & LIG, Grenoble, France

  • Venue: Journal on Data Semantics XV

  • Year: 2011


Abstract

In the area of semantic technologies, benchmarking and systematic evaluation are not yet as established as in other areas of computer science, such as information retrieval. In spite of successful attempts, more effort and experience are required to reach that level of maturity. In this paper, we report results and lessons learned from the Ontology Alignment Evaluation Initiative (OAEI), a benchmarking initiative for ontology matching. The goal of this work is twofold: on the one hand, we document the state of the art in evaluating ontology matching methods and give potential participants of the initiative a better understanding of the design and the underlying principles of the OAEI campaigns. On the other hand, we report the experience gained in this particular area of semantic technologies to potential developers of benchmarks for other kinds of systems. For this purpose, we describe the evaluation design used in the OAEI campaigns in terms of datasets, evaluation criteria and workflows, provide a global view of the results of the campaigns carried out from 2005 to 2010, and discuss upcoming trends, both specific to ontology matching and generally relevant to the evaluation of semantic technologies. Finally, we argue that further automation of benchmarking is needed to shorten the feedback cycle for tool developers.
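The evaluation criteria used in OAEI campaigns are centered on comparing a produced alignment against a reference alignment, typically via precision, recall and F-measure. The following is a minimal sketch of how these measures are computed; the tuple representation of correspondences and the sample entity names are illustrative assumptions, not OAEI's actual alignment format.

```python
# Sketch of the standard alignment evaluation measures used in OAEI-style
# campaigns: precision, recall and F-measure of a produced alignment against
# a reference alignment. A correspondence is modeled here as a
# (entity1, entity2, relation) tuple; this encoding is a simplification.

def evaluate(alignment, reference):
    """Return (precision, recall, f_measure) of `alignment` w.r.t. `reference`."""
    found = set(alignment)
    expected = set(reference)
    correct = found & expected  # correspondences that are both found and expected
    precision = len(correct) / len(found) if found else 0.0
    recall = len(correct) / len(expected) if expected else 0.0
    denom = precision + recall
    f_measure = (2 * precision * recall / denom) if denom else 0.0
    return precision, recall, f_measure

# Hypothetical example: two-ontology matching task with two expected
# correspondences, of which the matcher finds one plus one false positive.
reference = {("o1#Paper", "o2#Article", "="), ("o1#Author", "o2#Writer", "=")}
produced = {("o1#Paper", "o2#Article", "="), ("o1#Review", "o2#Report", "=")}
p, r, f = evaluate(produced, reference)
print(p, r, f)  # precision 0.5, recall 0.5, F-measure 0.5
```

Relaxed variants of these measures (e.g., crediting near-miss correspondences) have also been used in the campaigns, but the set-based version above is the baseline against which matcher results are reported.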