Using CrowdLogger for in situ information retrieval system evaluation

  • Authors:
  • Henry A. Feild; James Allan

  • Affiliations:
  • Endicott College, Beverly, MA, USA; University of Massachusetts Amherst, Amherst, MA, USA

  • Venue:
  • Proceedings of the 2013 workshop on Living labs for information retrieval evaluation
  • Year:
  • 2013

Abstract

A major hurdle faced by many information retrieval researchers---especially in academia---is evaluating retrieval systems in the wild. Challenges include tapping into large user bases, collecting user behavior data, and modifying a given retrieval system. We outline several options available to researchers for overcoming these challenges, along with their advantages and disadvantages. We then demonstrate how CrowdLogger, an open-source browser extension for Firefox and Google Chrome, can be used as an in situ evaluation platform.