A real-time system for motion retrieval and interpretation

  • Authors:
  • Mathieu Barnachon; Saïda Bouakaz; Boubakeur Boufama; Erwan Guillou

  • Affiliations:
  • Université de Lyon, CNRS, Université Lyon 1, LIRIS, UMR5205, 8 bd Niels Bohr, F-69622 Villeurbanne, France (Barnachon, Bouakaz, Guillou); School of Computer Science, University of Windsor, Canada N9B 3P4 (Boufama)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2013


Abstract

This paper proposes a new exemplar-based method for real-time human motion recognition using Motion Capture (MoCap) data. We have formalized streamed recognizable actions, coming from an online MoCap engine, into a motion graph that is similar to an animation motion graph. This graph is used as an automaton both to recognize known actions and to add new ones. We have defined and used a spatio-temporal metric for similarity measurement to achieve more accurate classification feedback. The proposed method has the advantage of being linear and incremental, making the recognition process very fast and the addition of a new action straightforward. Furthermore, actions can be recognized, with a score, even before they are fully completed. Thanks to the use of a skeleton-centric coordinate system, our recognition method is view-invariant. We have successfully tested our action recognition method on both synthetic and real data. We have also compared our results with four state-of-the-art methods on three well-known datasets for human action recognition. In particular, the comparisons have clearly shown the advantage of our method through better recognition rates.
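The abstract's core idea, a motion graph traversed as an automaton that scores streamed poses incrementally and can report a score before an action completes, can be illustrated with a minimal sketch. This is not the authors' implementation; the `MotionGraph` class, its pose representation (plain feature vectors), and the Euclidean frame distance are all simplifying assumptions made here for illustration.

```python
# Hypothetical sketch of exemplar-based, incremental action recognition:
# each known action is a sequence of key poses; streamed frames advance
# a per-action automaton, accumulating a distance-based score.
import math


class MotionGraph:
    def __init__(self):
        # Action name -> list of key poses (each pose a feature vector).
        self.actions = {}

    def add_action(self, name, poses):
        # Adding a new action is just storing its exemplar sequence,
        # which keeps the structure incremental.
        self.actions[name] = poses

    @staticmethod
    def _dist(a, b):
        # Simple Euclidean distance between two pose feature vectors
        # (stand-in for the paper's spatio-temporal metric).
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def recognize(self, stream):
        # Advance every action's automaton over the streamed frames.
        # Returns per-action (progress, mean distance); partial streams
        # still yield a score, so recognition can happen before the
        # action is fully completed.
        scores = {}
        for name, poses in self.actions.items():
            matched, total = 0, 0.0
            for frame in stream:
                if matched < len(poses):
                    total += self._dist(frame, poses[matched])
                    matched += 1
            progress = matched / len(poses)
            mean = total / matched if matched else float("inf")
            scores[name] = (progress, mean)
        return scores
```

A stream matching the first two of three "wave" key poses would already report progress 2/3 with a low mean distance, while a mismatched action accumulates a large one; a real system would additionally express poses in a skeleton-centric frame to obtain view invariance.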