Visual cues-based anticipation for percussionist-robot interaction

  • Authors:
  • Marcelo Cicconet; Mason Bretan; Gil Weinberg

  • Affiliations:
  • Georgia Tech, Atlanta, GA, USA (all authors)

  • Venue:
  • HRI '12 Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
  • Year:
  • 2012


Abstract

Visual cues-based anticipation is a fundamental aspect of human-human interaction, and it plays an especially important role in the time-demanding medium of group performance. In this work we explore the importance of visual gesture anticipation in music performance involving a human and a robot. We study the case in which a human percussionist plays a four-piece percussion set and a robot musician plays either the marimba or a three-piece percussion set. Computer vision is used to embed anticipation in the robotic response to the human's gestures. We developed two anticipation algorithms, which predict the strike location approximately 10 milliseconds or approximately 100 milliseconds before the strike occurs. Using the second algorithm, we show that the robot outperforms a group of human subjects, on average, in synchronizing its gesture with a reference strike. We also show that, in the tested group of users, having some advance notice helps a human synchronize a strike with a reference player, but beyond a certain lead time this benefit stops increasing.
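The abstract does not detail the anticipation algorithms, but one common way to predict a strike before impact from visual tracking is to extrapolate the mallet tip's descent toward the drumhead. The sketch below is a hypothetical illustration of that idea, not the authors' method: it assumes the mallet-tip height is sampled over time (e.g., from a computer-vision tracker) and that the final approach is roughly constant-velocity, so a linear extrapolation gives the expected impact time. The function name and data layout are assumptions for illustration.

```python
# Hypothetical sketch (not the paper's algorithm): predict when a mallet
# will hit the drum surface by linearly extrapolating its tracked descent.

def predict_strike_time(samples, surface_y=0.0):
    """Estimate the strike time from tracked mallet-tip heights.

    samples   -- list of (t, y) pairs: timestamp (s) and tip height (m)
                 above the drum surface, as produced by a vision tracker.
    surface_y -- height of the drumhead in the same coordinate frame.

    Returns the predicted impact time, or None if the tip is not
    currently descending toward the surface.
    """
    (t0, y0), (t1, y1) = samples[-2], samples[-1]
    vy = (y1 - y0) / (t1 - t0)        # vertical velocity (m/s), assumed constant
    if vy >= 0 or y1 <= surface_y:    # rising, or already at/below the surface
        return None
    return t1 + (surface_y - y1) / vy # linear extrapolation to impact


# Usage: tip at 10 cm falling to 5 cm over 50 ms -> impact predicted at t = 0.10 s,
# i.e., 50 ms of lead time for the robot to begin its own gesture.
t_hit = predict_strike_time([(0.00, 0.10), (0.05, 0.05)])
```

A real system would smooth the tracked positions (e.g., with a Kalman filter) and also classify which of the four drums the trajectory is aimed at; the 10 ms vs. 100 ms horizons reported in the paper would correspond to how early in the gesture such a prediction is made.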