Gestures for industry: intuitive human-robot communication from human observation

  • Authors:
  • Brian Gleeson, Karon MacLean, Amir Haddadi, Elizabeth Croft (University of British Columbia, Vancouver, Canada); Javier Alcazar (General Motors, Warren, USA)

  • Venue:
  • Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2013)
  • Year:
  • 2013

Abstract

Human-robot collaborative work has the potential to advance quality, efficiency, and safety in manufacturing. In this paper we present a gestural communication lexicon for human-robot collaboration in industrial assembly tasks and establish a methodology for producing such a lexicon. Our user experiments are grounded in a study of industry needs, giving our results potential real-world applicability. Actions required for industrial assembly tasks are abstracted into three classes: part acquisition, part manipulation, and part operations. We analyzed the communication between human pairs performing these subtasks and derived a set of communication terms and gestures. We found that participant-provided gestures are intuitive and well suited to robotic implementation, but that their interpretation is highly dependent on task context. We then implemented these gestures on a robot arm in a human-robot interaction context and found them to be easily interpreted by observers. We conclude that observation of human-human interaction can be effective in determining what should be communicated in a given human-robot task, how communication gestures should be executed, and, based on frequency of use, which gestures to prioritize when implementing a robotic system.