Retrieval evaluation on focused tasks

  • Authors:
  • Besnik Fetahu; Ralf Schenkel

  • Affiliations:
  • Saarland University, Saarbrücken, Germany; Saarland University, Saarbrücken, Germany

  • Venue:
  • SIGIR '12: Proceedings of the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval
  • Year:
  • 2012

Abstract

Ranking of retrieval systems for focused tasks requires a large number of relevance judgments. We propose an approach that minimizes the number of relevance judgments by approximating the performance measures with a Monte-Carlo sampling technique. Partial measures are computed from the available relevance judgments, while the remaining passages are annotated using a relevance probability distribution generated from the result rank. We define two conditions for stopping the assessment procedure once the ranking between systems is stable.
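To make the idea concrete, the following is a minimal Python sketch of the general technique described in the abstract: judged passages keep their labels, unjudged passages are sampled from a rank-based relevance prior, a measure (precision@k here) is averaged over the sampled worlds, and assessment stops once the induced system ordering stops changing. The prior `rank_relevance_probability`, the choice of precision@k, and the simple stability window are illustrative assumptions, not the paper's actual distribution, measures, or stopping conditions.

```python
import random


def rank_relevance_probability(rank, scale=0.9):
    """Hypothetical rank-based prior: higher-ranked passages are assumed
    more likely to be relevant (the paper's actual distribution differs)."""
    return scale / rank


def sample_relevance(results, judgments, rng):
    """Draw one Monte-Carlo 'world': judged passages keep their label,
    unjudged passages are sampled from the rank-based prior."""
    world = {}
    for rank, passage_id in enumerate(results, start=1):
        if passage_id in judgments:
            world[passage_id] = judgments[passage_id]
        else:
            world[passage_id] = rng.random() < rank_relevance_probability(rank)
    return world


def estimate_precision_at_k(results, judgments, k=10, samples=1000, seed=0):
    """Approximate precision@k by averaging over sampled relevance assignments."""
    rng = random.Random(seed)
    top_k = results[:k]
    total = 0.0
    for _ in range(samples):
        world = sample_relevance(top_k, judgments, rng)
        total += sum(world[p] for p in top_k) / k
    return total / samples


def ranking_is_stable(history, window=3):
    """Naive stand-in for the paper's stopping conditions: the system
    ordering has not changed over the last few assessment rounds."""
    if len(history) < window:
        return False
    recent = history[-window:]
    return all(r == recent[0] for r in recent)


if __name__ == "__main__":
    # Toy example: one topic, two systems returning ranked passage ids.
    runs = {"sysA": ["p1", "p2", "p3", "p4"], "sysB": ["p2", "p5", "p1", "p6"]}
    judgments = {"p1": True, "p5": False}  # partial judgments collected so far
    scores = {s: estimate_precision_at_k(r, judgments, k=4) for s, r in runs.items()}
    ranking = tuple(sorted(scores, key=scores.get, reverse=True))
    print(ranking, scores)
    print("stable:", ranking_is_stable([ranking]))  # False: only one round so far
```

In a full assessment loop, one would alternate between collecting a few more judgments, re-estimating the measures, and checking the stability condition, stopping as soon as further judgments no longer change the system ranking.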