Shepherding the crowd yields better work

  • Authors:
  • Steven Dow (Carnegie Mellon University, Pittsburgh, Pennsylvania, USA)
  • Anand Kulkarni (University of California, Berkeley, California, USA)
  • Scott Klemmer (Stanford University, Stanford, California, USA)
  • Björn Hartmann (University of California, Berkeley, California, USA)

  • Venue:
  • Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work
  • Year:
  • 2012

Abstract

Micro-task platforms provide massively parallel, on-demand labor. However, it can be difficult to reliably achieve high-quality work because online workers may behave irresponsibly, misunderstand the task, or lack necessary skills. This paper investigates whether timely, task-specific feedback helps crowd workers learn, persevere, and produce better results. We investigate this question through Shepherd, a feedback system for crowdsourced work. In a between-subjects study with three conditions, crowd workers wrote consumer reviews for six products they own. Participants in the None condition received no immediate feedback, consistent with most current crowdsourcing practices. Participants in the Self-assessment condition judged their own work. Participants in the External assessment condition received expert feedback. Self-assessment alone yielded better overall work than the None condition and helped workers improve over time. External assessment also yielded these benefits. Participants who received external assessment also revised their work more. We conclude by discussing interaction and infrastructure approaches for integrating real-time assessment into online work.