Gestures are strings: efficient online gesture spotting and classification using string matching

  • Authors:
  • Thomas Stiefmeier, Daniel Roggen, Gerhard Tröster

  • Affiliations:
  • Wearable Computing Lab, ETH Zürich, Zürich, Switzerland (all authors)

  • Venue:
  • Proceedings of the ICST 2nd international conference on Body area networks
  • Year:
  • 2007

Abstract

Context awareness is one mechanism that allows wearable computers to provide information proactively, unobtrusively, and with minimal user disturbance. Gestures and activities are an important aspect of the user's context. However, detecting and classifying gestures can be computationally expensive for low-power, miniaturized wearable platforms, such as those that may be integrated into garments. In this paper we introduce a novel method for online, real-time spotting and classification of gestures. Continuous user motion, acquired from a body-worn network of inertial sensors, is represented as strings of symbols encoding motion vectors. Fast string matching techniques, inspired by bioinformatics, spot trained gestures and classify them. Robustness to gesture variability is provided by approximate matching, efficiently implemented through dynamic programming. The method is successfully demonstrated by spotting and classifying occurrences of trained gestures within a continuous recording of a complex bicycle maintenance task. It executes in real time on a desktop computer using only a fraction of the available CPU time, and it requires only simple integer arithmetic operations, making it well suited to real-time operation on body-worn sensor nodes.
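To illustrate the core idea, the sketch below quantizes 2D motion vectors into direction symbols and then spots a template gesture in the symbol stream using approximate string matching via dynamic programming (here, Sellers' algorithm, a standard approximate-matching technique). This is a minimal sketch under assumptions, not the authors' implementation: the four-symbol alphabet, the template gesture, and the cost threshold are hypothetical choices. Note that per incoming symbol the matcher needs only integer additions and comparisons, consistent with the abstract's claim.

```python
# Illustrative sketch, not the paper's implementation: the direction
# alphabet, template gesture, and cost threshold are hypothetical.

def quantize(dx, dy):
    """Map a 2D motion vector to one of four direction symbols (y-up)."""
    if abs(dx) >= abs(dy):
        return "E" if dx >= 0 else "W"
    return "N" if dy >= 0 else "S"

def spot(template, stream, max_cost):
    """Online approximate matching (Sellers' algorithm): yield stream
    positions where an occurrence of `template` ends with edit
    distance <= max_cost. A match may start anywhere in the stream,
    so the DP entry for the empty template prefix stays 0. Only
    integer additions and comparisons are performed per symbol."""
    m = len(template)
    col = list(range(m + 1))        # DP column over the stream seen so far
    for pos, symbol in enumerate(stream):
        prev_diag = col[0]
        new_col = [0]               # free starting position in the stream
        for i in range(1, m + 1):
            cost = 0 if template[i - 1] == symbol else 1
            best = min(prev_diag + cost,     # match / substitution
                       col[i] + 1,           # skip a stream symbol
                       new_col[i - 1] + 1)   # skip a template symbol
            prev_diag = col[i]
            new_col.append(best)
        col = new_col
        if col[m] <= max_cost:
            yield pos, col[m]       # gesture spotted ending here

# Example: build a clockwise "NESW" template from motion vectors,
# then spot it in a noisy symbol stream with one edit allowed.
template = "".join(quantize(dx, dy)
                   for dx, dy in [(0, 1), (1, 0), (0, -1), (-1, 0)])
stream = "NNEESSWWSSNEESWN"
for end, cost in spot(template, stream, max_cost=1):
    print(f"gesture ends at index {end} with cost {cost}")
```

Because only one DP column is kept and updated per incoming symbol, memory is linear in the template length and the update is constant work per template symbol, which is what makes this style of matcher plausible for resource-constrained sensor nodes.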