Explaining User Performance in Information Retrieval: Challenges to IR Evaluation

  • Author: Kalervo Järvelin
  • Affiliation: University of Tampere, Finland
  • Venue: ICTIR '09 Proceedings of the 2nd International Conference on Theory of Information Retrieval: Advances in Information Retrieval Theory
  • Year: 2009


Abstract

The paper makes three points of significance for IR research: (1) The Cranfield paradigm of IR evaluation seems to lose power when one looks at human instead of system performance. (2) Searchers using IR systems in real life issue rather short queries, which individually often perform poorly. When used in sessions, however, they may be surprisingly effective. The searchers' strategies have not been sufficiently described and therefore cannot be properly understood, supported, or evaluated. (3) Searchers in real life seek to optimize the entire information access process, not just result quality. Evaluation of output alone is insufficient to explain searcher behavior.