Real time gesture recognition using continuous time recurrent neural networks

  • Authors:
  • Gonzalo Bailador; Daniel Roggen; Gerhard Tröster; Gracián Triviño

  • Affiliations:
  • Univ. Politécnica de Madrid, Madrid, Spain; Wearable Computing Lab, ETH Zürich, Zürich, Switzerland; Wearable Computing Lab, ETH Zürich, Zürich, Switzerland; European Centre for Soft Computing, Edificio Científico Tecnológico, Asturias, Spain

  • Venue:
  • Proceedings of the ICST 2nd international conference on Body area networks
  • Year:
  • 2007

Abstract

This paper presents a new approach to the problem of real-time gesture recognition using inexpensive accelerometers. The approach is based on the idea of creating a specialized signal predictor for each gesture class. These signal predictors forecast future acceleration values from current ones, and the errors between the measured acceleration of a given gesture and each predictor's output are used for classification. The approach is modular and allows new gesture classes to be added seamlessly. The predictors are implemented using Continuous Time Recurrent Neural Networks (CTRNNs). On the one hand, this kind of network exhibits rich dynamical behaviour that is useful for gesture recognition; on the other, it has a relatively low computational cost, which is an attractive feature for real-time systems.
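The abstract gives only the outline of the method, not its implementation. The sketch below illustrates the general idea in Python/NumPy: one Euler-integrated CTRNN per gesture class acts as a one-step-ahead predictor of the acceleration signal, and a gesture is assigned to the class whose predictor accumulates the smallest prediction error. The network sizes, step size, and random weights here are illustrative placeholders; in the paper the per-class networks would be trained or optimized offline on recorded gesture data.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CTRNNPredictor:
    """One CTRNN per gesture class: maps the current acceleration
    sample to a prediction of the next sample (Euler integration)."""

    def __init__(self, n_neurons, n_inputs, dt=0.01, seed=None):
        rng = np.random.default_rng(seed)
        self.dt = dt
        self.tau = rng.uniform(0.1, 1.0, n_neurons)               # time constants
        self.bias = rng.uniform(-1.0, 1.0, n_neurons)             # neuron biases
        self.w = rng.uniform(-1.0, 1.0, (n_neurons, n_neurons))   # recurrent weights
        self.w_in = rng.uniform(-1.0, 1.0, (n_neurons, n_inputs)) # input weights
        self.w_out = rng.uniform(-1.0, 1.0, (n_inputs, n_neurons))# readout weights
        self.y = np.zeros(n_neurons)                              # neuron states

    def reset(self):
        self.y[:] = 0.0

    def step(self, accel_sample):
        """Advance the network one Euler step with the current sample
        as input, and return the predicted next acceleration sample."""
        act = sigmoid(self.y + self.bias)
        dydt = (-self.y + self.w @ act + self.w_in @ accel_sample) / self.tau
        self.y = self.y + self.dt * dydt
        return self.w_out @ sigmoid(self.y + self.bias)

def classify(signal, predictors):
    """Run each class-specific predictor over the gesture signal,
    accumulate its squared one-step prediction error, and return the
    label whose predictor fits the signal best (lowest error)."""
    errors = {}
    for label, net in predictors.items():
        net.reset()
        err = 0.0
        for t in range(len(signal) - 1):
            pred = net.step(signal[t])
            err += np.sum((pred - signal[t + 1]) ** 2)
        errors[label] = err
    return min(errors, key=errors.get)

# Hypothetical usage: 3-axis accelerometer signal, two gesture classes.
predictors = {"circle": CTRNNPredictor(8, 3, seed=0),
              "shake": CTRNNPredictor(8, 3, seed=1)}
signal = np.random.default_rng(2).normal(size=(100, 3))  # stand-in gesture data
print(classify(signal, predictors))
```

Note that adding a new gesture class only requires training one additional predictor and inserting it into the dictionary, which reflects the modularity claimed in the abstract.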