Prioritizing relevance judgments to improve the construction of IR test collections

  • Authors:
  • Mehdi Hosseini (University College London, London, United Kingdom); Ingemar J. Cox (University College London, London, United Kingdom); Natasa Milic-Frayling (Microsoft Research Cambridge, Cambridge, United Kingdom); Trevor Sweeting (University College London, London, United Kingdom); Vishwa Vinay (Microsoft Research Cambridge, Cambridge, United Kingdom)

  • Venue:
  • Proceedings of the 20th ACM International Conference on Information and Knowledge Management
  • Year:
  • 2011

Abstract

We consider the problem of optimally allocating a fixed budget to construct a test collection with associated relevance judgments, such that it can (i) accurately evaluate the relative performance of the participating systems, and (ii) generalize to new, previously unseen systems. We propose a two-stage approach. For a given set of queries, we adopt the traditional pooling method and use a portion of the budget to evaluate a set of documents retrieved by the participating systems. Next, we analyze the relevance judgments to prioritize the queries and the remaining pooled documents for further relevance assessment. The query prioritization is formulated as a convex optimization problem, which permits efficient solution and provides a flexible framework for incorporating various constraints. Query-document pairs with the highest priority scores are evaluated using the remaining budget. We evaluate our resource optimization approach on the TREC 2004 Robust track collection and demonstrate that our optimization techniques are cost efficient and yield a significant improvement in the reusability of the test collections.
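The second stage described above — spending the remaining budget on the highest-priority query-document pairs — can be sketched as follows. This is an illustrative sketch only: the paper derives the priority scores via convex optimization, whereas here the scores, the `select_for_judging` function, and the unit cost-per-judgment model are all assumptions introduced for illustration.

```python
# Hypothetical sketch of the second stage: given priority scores for
# query-document pairs (assumed as inputs here; the paper computes them
# by solving a convex optimization problem), spend the remaining
# assessment budget on the highest-priority pairs.

def select_for_judging(priorities, budget, cost_per_judgment=1):
    """Return query-document pairs to judge, highest priority first,
    without exceeding the remaining assessment budget."""
    ranked = sorted(priorities.items(), key=lambda kv: kv[1], reverse=True)
    selected, spent = [], 0
    for pair, _score in ranked:
        if spent + cost_per_judgment > budget:
            break
        selected.append(pair)
        spent += cost_per_judgment
    return selected

# Example: five candidate pairs, budget for three judgments.
scores = {
    ("q1", "d3"): 0.9,
    ("q1", "d7"): 0.4,
    ("q2", "d1"): 0.7,
    ("q3", "d2"): 0.6,
    ("q3", "d5"): 0.1,
}
print(select_for_judging(scores, budget=3))
```

In the paper's framework, the scores would additionally reflect per-query constraints from the optimization, so the selection need not reduce to a simple top-k cutoff as it does in this sketch.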