“THAT’s what I was looking for”: Comparing user-rated relevance with search engine rankings

  • Authors: Sameer Patil, Sherman R. Alpert, John Karat, Catherine Wolf

  • Affiliations: Department of Informatics, Donald Bren School of Information and Computer Sciences, University of California, Irvine, CA (Patil); IBM T. J. Watson Research Center, Hawthorne, NY (Alpert, Karat, Wolf)

  • Venue: INTERACT'05: Proceedings of the 2005 IFIP TC13 International Conference on Human-Computer Interaction

  • Year: 2005

Quantified Score: Hi-index 0.01

Abstract

We present a lightweight tool for comparing the relevance ranking produced by a search engine with the relevance as actually judged by the user performing the query. Using the tool, we conducted a user study with two different versions of the search engine for a large corporate web site containing more than 1.8 million pages, and with the popular search engine Google™. The tool provides an inexpensive and efficient way to make this comparison, and it can easily be extended to any search engine that provides an API. Relevance feedback from actual users can be used to assess the precision and recall of a search engine’s retrieval algorithms and, perhaps more importantly, to tune its relevance ranking algorithms to better match user needs. We found the tool to be quite effective both for comparing different versions of the same search engine and for benchmarking against a standard.
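The abstract describes the comparison only at a high level. The sketch below illustrates, under stated assumptions, how a user's relevance ratings for a ranked result list could be turned into a precision-at-k figure and a rank-agreement score of the kind such a comparison needs; the data, function names, and 0-2 rating scale are hypothetical illustrations, not details taken from the authors' tool.

# Minimal sketch (not the authors' tool): comparing a search engine's ranking
# with a user's own relevance ratings for the same result list.
# All names and data below are hypothetical illustrations.

from itertools import combinations


def precision_at_k(user_ratings, k, relevant_threshold=1):
    """Fraction of the top-k results the user rated as relevant.

    user_ratings: user relevance ratings listed in the order the engine
    returned the results (index 0 = rank 1).
    """
    top_k = user_ratings[:k]
    relevant = sum(1 for rating in top_k if rating >= relevant_threshold)
    return relevant / k


def rank_agreement(user_ratings):
    """Kendall-tau-style agreement between the engine's rank order and the user's ratings.

    For every pair of results, the engine implicitly claims the earlier one is
    more relevant; count how often the user's ratings agree or disagree.
    Returns a value in [-1, 1]; pairs tied in the user's ratings are ignored.
    """
    concordant = discordant = 0
    for i, j in combinations(range(len(user_ratings)), 2):  # engine ranks i above j
        if user_ratings[i] > user_ratings[j]:
            concordant += 1
        elif user_ratings[i] < user_ratings[j]:
            discordant += 1
    compared = concordant + discordant
    return (concordant - discordant) / compared if compared else 0.0


if __name__ == "__main__":
    # Hypothetical example: user rated the top 5 results on a 0-2 scale,
    # listed in the order the engine ranked them.
    ratings = [2, 0, 2, 1, 0]
    print("P@3:", precision_at_k(ratings, k=3))        # ~0.67
    print("Rank agreement:", rank_agreement(ratings))  # engine order vs. user judgment

A pairwise agreement measure of this kind works with whatever rating scale the study gives users, which is why it is shown here alongside precision-at-k rather than a measure tied to binary relevance only.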