An infrastructure for the evaluation and comparison of information retrieval systems

  • Authors:
  • Robert E. Broadbent; Gary S. Saunders; Joseph J. Ekstrom

  • Affiliations:
  • BYU, Provo, UT; BYU, Provo, UT; BYU, Provo, UT

  • Venue:
  • Proceedings of the 7th conference on Information technology education
  • Year:
  • 2006


Abstract

Even though information retrieval systems have been successfully deployed for over 45 years, the field continues to evolve in performance, functionality, and accuracy. Hundreds of products are available, each with different indexing and retrieval characteristics. How does one choose the appropriate system for a given application? The first step is the creation of a framework for comparing IR products, together with an infrastructure that supports automated execution of tests and analysis of their results. The next step is providing an environment for subjective measurement using human evaluators. In this paper we briefly introduce the concepts used in IR system evaluation and report on our initial implementation of a framework for evaluating indexing performance. We also report a test case that provides a comparative analysis of the indexing characteristics of three IR system implementations on a common collection of documents.
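The concepts of IR system evaluation that the abstract mentions are conventionally grounded in metrics such as precision and recall. As a minimal illustrative sketch (not taken from the paper; the document IDs and relevance judgments below are hypothetical), these can be computed per query as:

```python
# Illustrative sketch of the two classic IR evaluation metrics.
# Precision: fraction of retrieved documents that are relevant.
# Recall: fraction of relevant documents that were retrieved.

def precision_recall(retrieved, relevant):
    """Return (precision, recall) for a single query."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical query: the system returned 4 documents, 3 of which are
# relevant; the collection contains 6 relevant documents in total.
p, r = precision_recall(
    {"d1", "d2", "d3", "d4"},
    {"d1", "d2", "d3", "d5", "d6", "d7"},
)
# p == 0.75, r == 0.5
```

An automated comparison infrastructure of the kind the paper describes would compute such metrics across many queries and systems over a common document collection.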