Overview of the Answer Validation Exercise 2006

  • Authors:
  • Anselmo Peñas, Álvaro Rodrigo, Valentín Sama, Felisa Verdejo

  • Affiliation:
  • Dpto. Lenguajes y Sistemas Informáticos, UNED

  • Venue:
  • CLEF'06 Proceedings of the 7th international conference on Cross-Language Evaluation Forum: evaluation of multilingual and multi-modal information retrieval
  • Year:
  • 2006


Abstract

The first Answer Validation Exercise (AVE) was launched at the Cross-Language Evaluation Forum 2006. The task aims at developing systems able to decide whether the answer returned by a Question Answering (QA) system is correct or not. The exercise is described here together with the evaluation methodology and the participating systems' results. The starting point for AVE 2006 was the reformulation of Answer Validation as a Recognizing Textual Entailment problem, under the assumption that hypotheses can be generated automatically by instantiating hypothesis patterns with the QA systems' answers. Eleven groups participated with 38 runs in 7 different languages. Systems that reported the use of logic obtained the best results in their respective subtasks.
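The pattern-instantiation step mentioned above can be illustrated with a minimal sketch. The function name, placeholder token, and example pattern below are illustrative assumptions, not the actual AVE data format:

```python
def instantiate_hypothesis(pattern: str, answer: str) -> str:
    """Fill the answer slot of a hypothesis pattern with a QA system's answer.

    The <ANSWER> placeholder is an assumed convention for this sketch.
    """
    return pattern.replace("<ANSWER>", answer)


# Hypothetical example: for the question "Who wrote Don Quixote?",
# a pattern derived from the question might look like this:
pattern = "<ANSWER> wrote Don Quixote."
hypothesis = instantiate_hypothesis(pattern, "Miguel de Cervantes")
print(hypothesis)  # Miguel de Cervantes wrote Don Quixote.
```

The answer validation task then reduces to textual entailment: deciding whether the text supporting the QA system's answer entails the generated hypothesis.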