Relevance feedback revisited

  • Author: Donna Harman
  • Affiliation: National Institute of Standards and Technology, Gaithersburg, Maryland
  • Venue: SIGIR '92: Proceedings of the 15th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
  • Year: 1992

Abstract

Researchers have found relevance feedback to be effective in interactive information retrieval, although few formal user experiments have been conducted. In preparation for running a user experiment on a large document collection, experiments were performed at NIST to fill in some of the missing links in the use of the probabilistic retrieval model. These experiments, using the Cranfield 1400 collection, showed the importance of query expansion in addition to query reweighting: adding as few as 20 well-selected terms yielded performance improvements of over 100%. They also showed that performing multiple iterations of feedback is highly effective.
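The abstract does not spell out the weighting scheme, so as a rough illustration of feedback in the probabilistic model, the sketch below uses the standard Robertson/Sparck Jones relevance weight to reweight query terms and to select the top expansion terms from the judged-relevant documents. The function names (`rsj_weight`, `expand_and_reweight`), the representation of documents as token lists, and the default of 20 added terms are illustrative assumptions, not the paper's implementation.

```python
import math
from collections import Counter

def rsj_weight(r, n, R, N):
    """Robertson/Sparck Jones relevance weight with the usual 0.5 smoothing.
    r: relevant docs containing the term, n: all docs containing the term,
    R: number of known relevant docs, N: collection size."""
    return math.log(((r + 0.5) * (N - n - R + r + 0.5))
                    / ((n - r + 0.5) * (R - r + 0.5)))

def expand_and_reweight(query_terms, relevant_docs, all_docs, k=20):
    """Reweight the original query terms and add the k highest-weighted
    new terms drawn from the judged-relevant documents (query expansion).
    Documents are token lists; names and defaults here are hypothetical."""
    N, R = len(all_docs), len(relevant_docs)
    n = Counter(t for doc in all_docs for t in set(doc))       # collection doc. freq.
    r = Counter(t for doc in relevant_docs for t in set(doc))  # relevant doc. freq.
    candidates = set(query_terms) | set(r)
    weights = {t: rsj_weight(r[t], n[t], R, N) for t in candidates}
    # Keep every original query term; add the k best-scoring new terms.
    added = sorted((t for t in weights if t not in query_terms),
                   key=weights.get, reverse=True)[:k]
    return {t: weights[t] for t in set(query_terms) | set(added)}
```

On a Cranfield-sized collection this exhaustive counting is cheap; an iterative-feedback loop, as the abstract describes, would rerun retrieval with the expanded, reweighted query and repeat the process with the newly judged documents.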