Analyzing features for activity recognition

  • Authors:
  • Tâm Huynh; Bernt Schiele

  • Affiliations:
  • Multimodal Interactive Systems, TU Darmstadt, Germany (both authors)

  • Venue:
  • Proceedings of the 2005 joint conference on Smart objects and ambient intelligence: innovative context-aware services: usages and technologies
  • Year:
  • 2005

Abstract

Human activity is one of the most important ingredients of context information. In wearable computing scenarios, activities such as walking, standing and sitting can be inferred from data provided by body-worn acceleration sensors. In such settings, most approaches use a single set of features, regardless of which activity is to be recognized. In this paper we show that recognition rates can be improved by careful selection of individual features for each activity. We present a systematic analysis of features computed from a real-world data set and show how the choice of feature and the window length over which the feature is computed affect the recognition rates for different activities. Finally, we recommend suitable features and window lengths for a set of common activities.
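
The paper itself does not include code; the following is a minimal illustrative sketch of the kind of sliding-window feature extraction the abstract describes, assuming 3-axis accelerometer samples in a NumPy array and a hypothetical set of per-axis features (mean, variance, FFT energy). The exact feature set, window lengths and data format analyzed in the paper may differ.

```python
# Hedged sketch (not the authors' code): per-window features from
# 3-axis acceleration data, assuming `signal` has shape (n_samples, 3).
import numpy as np

def window_features(signal, sample_rate_hz=100, window_seconds=1.0, overlap=0.5):
    """Compute per-axis mean, variance and FFT energy over sliding windows.

    These are features commonly used in acceleration-based activity
    recognition; window length and overlap are tunable parameters.
    """
    win = int(window_seconds * sample_rate_hz)    # window length in samples
    step = max(1, int(win * (1.0 - overlap)))     # hop size between windows
    features = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]             # shape (win, 3)
        mean = w.mean(axis=0)
        var = w.var(axis=0)
        # Spectral energy per axis, with the DC component removed.
        spectrum = np.fft.rfft(w - mean, axis=0)
        energy = (np.abs(spectrum) ** 2).sum(axis=0) / win
        features.append(np.concatenate([mean, var, energy]))
    return np.asarray(features)                   # shape (n_windows, 9)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_accel = rng.normal(size=(1000, 3))       # stand-in for sensor data
    print(window_features(fake_accel).shape)
```

Varying `window_seconds` in such a pipeline is one way to reproduce the kind of window-length comparison the abstract refers to: shorter windows react faster to activity changes, while longer windows yield more stable estimates of features like variance and spectral energy.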