Real-Time Gesture Recognition, Evaluation and Feed-Forward Correction of a Multimodal Tai-Chi Platform

  • Authors:
  • Otniel Portillo-Rodriguez, Oscar O. Sandoval-Gonzalez, Emanuele Ruffaldi, Rosario Leonardi, Carlo Alberto Avizzano, Massimo Bergamasco

  • Affiliations:
  • PERCRO, Perceptual Robotics Laboratory, Scuola Superiore Sant'Anna, Pisa, Italy (all authors)

  • Venue:
  • HAID '08 Proceedings of the 3rd international workshop on Haptic and Audio Interaction Design
  • Year:
  • 2008


Abstract

This paper presents a multimodal system capable of understanding and correcting, in real time, the movements of Tai-Chi students through the integration of audio, visual, and tactile technologies. The platform acts as a virtual teacher that transfers the knowledge of five Tai-Chi movements, using feedback stimuli to compensate for the errors a user commits while performing a gesture. The fundamental components of this multimodal interface are the gesture recognition system (based on k-means clustering, Probabilistic Neural Networks (PNN), and Finite State Machines (FSM)) and the real-time motion descriptor, which computes and qualifies the movements performed by the student with respect to those performed by the master. From this comparison several feedback signals are obtained, and the movement is corrected in real time by varying the audio, visual, and tactile parameters of the different devices. Experiments with this multimodal platform confirmed that the quality of the movements performed by the students improves significantly.
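The abstract names a three-stage recognition pipeline: k-means clustering to quantize pose vectors, a PNN (a Parzen-window Gaussian-kernel classifier) to label each frame, and an FSM to check that labels occur in the order defining a gesture. The sketch below is a minimal illustration of that pipeline structure, not the paper's implementation; all function names, parameters (e.g. `sigma`), and the toy data are assumptions.

```python
import math
import random

def dist2(a, b):
    """Squared Euclidean distance between two pose vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's k-means: quantizes pose vectors into k clusters."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: dist2(p, centroids[c]))
            clusters[i].append(p)
        for i, c in enumerate(clusters):
            if c:  # keep old centroid if a cluster goes empty
                centroids[i] = tuple(sum(x) / len(c) for x in zip(*c))
    return centroids

def pnn_classify(x, training, sigma=0.5):
    """PNN as a Parzen-window classifier: sum a Gaussian kernel per
    training example, grouped by class label, and pick the max."""
    scores = {}
    for label, p in training:
        scores[label] = scores.get(label, 0.0) + math.exp(-dist2(x, p) / (2 * sigma ** 2))
    return max(scores, key=scores.get)

class GestureFSM:
    """Accepts once the per-frame labels pass through its states in order."""
    def __init__(self, states):
        self.states = states
        self.i = 0
    def step(self, label):
        if self.i < len(self.states) and label == self.states[self.i]:
            self.i += 1
        return self.i == len(self.states)

# Toy usage: two pose clusters "A" and "B", and a gesture defined as A -> B.
train = [("A", (0.0, 0.0)), ("A", (0.2, 0.1)), ("B", (5.0, 5.0)), ("B", (5.1, 4.9))]
fsm = GestureFSM(["A", "B"])
for frame in [(0.1, 0.0), (4.9, 5.1)]:
    done = fsm.step(pnn_classify(frame, train))
print(done)  # True: the frame sequence matched the gesture's state order
```

In the paper's setting the FSM would run over a continuous label stream, resetting on mismatch; the linear accept-only automaton above is the simplest form of that idea.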