CrowdStudy: general toolkit for crowdsourced evaluation of web interfaces

  • Authors:
  • Michael Nebeling, Maximilian Speicher, Moira C. Norrie

  • Affiliations:
  • ETH Zurich, Zurich, Switzerland (all authors)

  • Venue:
  • Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS '13)
  • Year:
  • 2013

Abstract

While traditional usability testing methods can be both time-consuming and expensive, tools for automated usability evaluation tend to oversimplify the problem by limiting themselves to supporting only certain evaluation criteria, settings, tasks and scenarios. We present CrowdStudy, a general web toolkit that combines support for automated usability testing with crowdsourcing to facilitate large-scale online user testing. CrowdStudy builds on existing crowdsourcing techniques for recruiting workers and guiding them through complex tasks, but implements mechanisms specifically designed for usability studies, allowing testers to control user sampling and conduct evaluations for particular contexts of use. Our toolkit provides support for context-aware data collection and analysis based on an extensible set of metrics, as well as tools for managing, reviewing and analysing any collected data. The paper demonstrates several useful features of CrowdStudy in two different scenarios and discusses the benefits and trade-offs of crowdsourced evaluation.
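To make the idea of an "extensible set of metrics" concrete, the following is a minimal TypeScript sketch of how a plug-in metric interface for a web usability toolkit might look. All names, types and the registry design here are illustrative assumptions, not CrowdStudy's actual API, which the abstract does not specify.

```typescript
// Hypothetical sketch of an extensible usability-metric interface.
// Names and types are illustrative assumptions, not CrowdStudy's real API.

interface InteractionEvent {
  type: string;      // e.g. "click", "scroll", "keypress"
  timestamp: number; // milliseconds since session start
  target: string;    // CSS selector of the event target
}

interface SessionContext {
  viewport: { width: number; height: number };
  userAgent: string;
  taskId: string;
}

// A metric consumes a recorded session and yields a single score.
interface UsabilityMetric {
  name: string;
  compute(events: InteractionEvent[], context: SessionContext): number;
}

// Example plug-in: time on task, in seconds.
const timeOnTask: UsabilityMetric = {
  name: "time-on-task",
  compute(events) {
    if (events.length === 0) return 0;
    const first = events[0].timestamp;
    const last = events[events.length - 1].timestamp;
    return (last - first) / 1000;
  },
};

// A registry lets testers add new metrics without touching the core toolkit.
class MetricRegistry {
  private metrics: UsabilityMetric[] = [];

  register(metric: UsabilityMetric): void {
    this.metrics.push(metric);
  }

  evaluate(
    events: InteractionEvent[],
    context: SessionContext
  ): Record<string, number> {
    const results: Record<string, number> = {};
    for (const m of this.metrics) {
      results[m.name] = m.compute(events, context);
    }
    return results;
  }
}

// Usage: register metrics once, then evaluate each recorded session.
const registry = new MetricRegistry();
registry.register(timeOnTask);
```

The point of the sketch is the separation it implies: the toolkit records raw interaction events and context, while evaluation criteria are pluggable units that can be added per study, which is one plausible way to avoid hard-coding a fixed set of metrics.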