Improving search engines using human computation games

  • Authors:
  • Hao Ma;Raman Chandrasekar;Chris Quirk;Abhishek Gupta

  • Affiliations:
  • The Chinese University of Hong Kong, Hong Kong, Hong Kong;Microsoft Research, Redmond, USA;Microsoft Research, Redmond, USA;Georgia Institute of Technology, Atlanta, USA

  • Venue:
  • Proceedings of the 18th ACM conference on Information and knowledge management
  • Year:
  • 2009

Abstract

Work on evaluating and improving the relevance of web search engines typically uses human relevance judgments or clickthrough data. Both methods frame the problem as learning a mapping from queries to web pages. In this paper, we identify some issues with this approach and suggest an alternative: learning a mapping from web pages to queries. In particular, we use human computation games to elicit data about web pages from players, data that can then be used to improve search. We describe three human computation games that we developed, with a focus on Page Hunt, a single-player game. We describe experiments conducted with several hundred game players, highlight some interesting aspects of the data obtained, and define the 'findability' metric. We also show how we automatically extract query alterations for use in query refinement, using techniques from bitext matching. The data elicited from players has several other applications, including providing metadata for pages and identifying ranking issues.
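To make the 'findability' notion concrete, here is a minimal sketch, assuming findability is measured as the fraction of game plays whose query surfaced the target page within the top k search results; the Play record, its fields, and the findability function are illustrative names, not taken from the paper itself.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Play:
        """One hypothetical game play: a player's query attempt for a target page."""
        page_url: str
        query: str
        rank: Optional[int]  # rank of the target page in the results; None if not returned

    def findability(plays: List[Play], k: int = 5) -> float:
        """Fraction of plays whose query placed the target page in the top k results.

        Assumption: a play counts as a success when the target page appears
        at rank k or better for the player's query.
        """
        if not plays:
            return 0.0
        hits = sum(1 for p in plays if p.rank is not None and p.rank <= k)
        return hits / len(plays)

Under this reading, a page that most players can surface with a short query scores near 1.0, while a page that players rarely retrieve scores near 0.0, which is the kind of signal the abstract suggests can flag ranking issues.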