Continuous realtime gesture following and recognition

  • Authors:
  • Frédéric Bevilacqua, Bruno Zamborlin, Anthony Sypniewski, Norbert Schnell, Fabrice Guédy, Nicolas Rasamimanana

  • Affiliations:
  • Real Time Musical Interactions Team, IRCAM, CNRS - STMS, Paris, France (all authors)

  • Venue:
  • GW'09 Proceedings of the 8th international conference on Gesture in Embodied Communication and Human-Computer Interaction
  • Year:
  • 2009

Abstract

We present an HMM-based system for real-time gesture analysis. The system continuously outputs parameters describing a gesture's time progression and its likelihood, computed by comparing the performed gesture with stored reference gestures. The method relies on a detailed modeling of multidimensional temporal curves. Compared with standard HMM systems, the learning procedure is simplified by incorporating prior knowledge, which allows the system to learn from a single example per class. Several applications have been developed with this system in the context of music education, music and dance performance, and interactive installations. Typically, the estimated time progression is used to synchronize physical gestures to sound or video by time-stretching/compressing audio buffers or video playback.
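The core idea of the abstract can be sketched as a left-to-right HMM built directly from one reference recording, with an incremental forward pass reporting time progression and likelihood per frame. This is an illustrative sketch, not the authors' implementation: the class name, the Gaussian emission model, and the parameter values (`sigma`, the stay/next/skip transition probabilities) are assumptions standing in for the paper's prior knowledge.

```python
import numpy as np

class GestureFollower:
    """Minimal left-to-right HMM gesture follower (illustrative sketch).

    Each sample of a single recorded reference gesture becomes one HMM
    state with a fixed-variance Gaussian observation model; the fixed
    transition and emission parameters stand in for the prior knowledge
    that replaces conventional Baum-Welch training on many examples.
    """

    def __init__(self, reference, sigma=0.2,
                 p_stay=0.3, p_next=0.6, p_skip=0.1):
        ref = np.asarray(reference, dtype=float)
        self.ref = ref[:, None] if ref.ndim == 1 else ref  # (n, dims)
        self.n = len(self.ref)
        self.sigma = sigma  # assumed observation noise scale
        self.trans = (p_stay, p_next, p_skip)
        self.reset()

    def reset(self):
        # All probability mass starts on the first state.
        self.alpha = np.zeros(self.n)
        self.alpha[0] = 1.0

    def step(self, obs):
        """Incremental forward update for one incoming sample.

        Returns (progression, likelihood): the normalized time position
        within the reference gesture, and the un-normalized likelihood
        of the current frame under the model.
        """
        p_stay, p_next, p_skip = self.trans
        a = self.alpha
        new = p_stay * a
        new[1:] += p_next * a[:-1]
        new[2:] += p_skip * a[:-2]
        # Gaussian emission probability for each state.
        d2 = np.sum((self.ref - np.atleast_1d(obs)) ** 2, axis=1)
        new *= np.exp(-d2 / (2.0 * self.sigma ** 2))
        likelihood = new.sum()
        if likelihood > 0:
            new /= likelihood
        self.alpha = new
        # Time progression = expected state index, scaled to [0, 1].
        progression = float(np.arange(self.n) @ new) / max(self.n - 1, 1)
        return progression, likelihood

# Usage: replay a 1-D template gesture as if it were live input.
reference = np.linspace(0.0, 10.0, 50)
follower = GestureFollower(reference)
for sample in reference:
    progression, likelihood = follower.step(sample)
```

When the template is replayed exactly, the reported progression should climb toward 1.0; in an application, that value would drive the playback position of a time-stretched audio buffer, while the likelihood (one follower per reference gesture) supports recognition.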