Fusion of String-Matched Templates for Continuous Activity Recognition

  • Authors:
  • Thomas Stiefmeier; Daniel Roggen; Gerhard Tröster

  • Affiliations:
  • Wearable Computing Lab, ETH Zürich, Switzerland, stiefmeier@ife.ee.ethz.ch; Wearable Computing Lab, ETH Zürich, Switzerland, droggen@ife.ee.ethz.ch; Wearable Computing Lab, ETH Zürich, Switzerland, troester@ife.ee.ethz.ch

  • Venue:
  • ISWC '07 Proceedings of the 2007 11th IEEE International Symposium on Wearable Computers
  • Year:
  • 2007

Abstract

This paper describes a new method for continuous activity recognition based on the fusion of string-matched activity templates. The underlying segmentation and spotting approach is carried out on several symbol streams in parallel. These streams represent motion trajectories of body limbs in Cartesian space, acquired from body-worn inertial sensors. We present first results of our method in a highly complex real-world application. Eight subjects performed 3800 activity instances of a checking procedure in car assembly, adding up to 480 minutes of recordings. Selecting 6 activity classes with 468 occurrences for these first investigations, we obtained an accuracy of up to 87% in the user-dependent case.
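As a rough illustration of the kind of processing the abstract describes, the sketch below performs approximate string matching of a symbolic activity template against a quantized limb-motion stream (edit distance with a free starting position), and then fuses detections from several limb streams with a naive agreement vote. The symbol alphabet, thresholds, and fusion rule are assumptions for demonstration only, not the specific algorithm evaluated in the paper.

```python
# Minimal sketch, assuming quantized motion trajectories are already encoded
# as symbol strings (one string per body limb). Not the authors' exact method.
from typing import List


def match_template(stream: str, template: str) -> List[int]:
    """Edit distance of `template` to the best substring ending at each
    stream position (approximate matching DP with free start)."""
    m = len(template)
    prev = list(range(m + 1))  # column for the empty text prefix
    costs = []
    for symbol in stream:
        curr = [0] * (m + 1)   # row 0 is free: a match may start anywhere
        for i in range(1, m + 1):
            sub = prev[i - 1] + (0 if template[i - 1] == symbol else 1)
            curr[i] = min(sub, prev[i] + 1, curr[i - 1] + 1)
        costs.append(curr[m])
        prev = curr
    return costs


def detect(stream: str, template: str, threshold: int) -> List[int]:
    """Stream positions where the template matches within `threshold` edits."""
    return [t for t, c in enumerate(match_template(stream, template))
            if c <= threshold]


def fuse(detections_per_stream: List[List[int]],
         tolerance: int = 2, min_votes: int = 2) -> List[int]:
    """Keep a detection if at least `min_votes` streams report a match within
    `tolerance` positions of each other (illustrative agreement-based fusion)."""
    if not detections_per_stream:
        return []
    fused = []
    for anchor in detections_per_stream[0]:
        votes = sum(any(abs(anchor - t) <= tolerance for t in dets)
                    for dets in detections_per_stream)
        if votes >= min_votes:
            fused.append(anchor)
    return fused


if __name__ == "__main__":
    # Hypothetical symbol streams for two limbs and one activity template.
    left_hand = "aabbccddaabXccdd"
    right_hand = "aabbccddaabbcZdd"
    template = "aabbccdd"
    dets = [detect(s, template, threshold=1) for s in (left_hand, right_hand)]
    print(fuse(dets))  # positions where both streams agree on a match
```

Per-stream matching followed by a late fusion step mirrors the structure outlined in the abstract: each limb trajectory is spotted independently on its own symbol stream, and the individual decisions are combined afterwards.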