Toward crowdsourcing micro-level behavior annotations: the challenges of interface, training, and generalization

  • Authors:
  • Sunghyun Park; Philippa Shoemark; Louis-Philippe Morency

  • Affiliations:
  • University of Southern California, Los Angeles, CA, USA (all authors)

  • Venue:
  • Proceedings of the 19th international conference on Intelligent User Interfaces
  • Year:
  • 2014


Abstract

Research involving human behavior analysis usually requires laborious and costly effort to obtain micro-level behavior annotations on a large video corpus. With the emerging paradigm of crowdsourcing, however, these efforts can be considerably reduced. We first present OCTAB (Online Crowdsourcing Tool for Annotations of Behaviors), a web-based annotation tool that enables precise and convenient behavior annotation in videos and is directly portable to popular crowdsourcing platforms. As part of OCTAB, we introduce a training module with specialized visualizations. The training module's design was inspired by an observational study of experienced local coders, and it enables an iterative procedure for effectively training crowd workers online. Finally, we present an extensive set of experiments evaluating the feasibility of our crowdsourcing approach for obtaining micro-level behavior annotations in videos, showing the improvement in annotation reliability and accuracy when online crowd workers are properly trained. We also show that our training approach generalizes to a new, independent video corpus.