Recognition of hand movements using wearable accelerometers

  • Authors:
  • Narayanan C. Krishnan, Colin Juillard, Dirk Colbry, Sethuraman Panchanathan

  • Affiliations:
  • (Correspd.) Center for Cognitive Ubiquitous Computing, Department of Computer Science and Engineering, School of Computing and Informatics, Arizona State University, Tempe, Arizona, USA. E-mail: {narayanan.ck ...

  • Venue:
  • Journal of Ambient Intelligence and Smart Environments
  • Year:
  • 2009


Abstract

Accelerometer-based activity recognition systems have typically focused on recognizing simple ambulatory activities of daily life, such as walking, sitting, standing, and climbing stairs. In this work, we developed and evaluated algorithms for detecting and recognizing short-duration hand movements (lift to mouth, scoop, stir, pour, unscrew cap). These actions are part of two larger and more complex Instrumental Activities of Daily Living (IADLs): making a drink and drinking. We collected data using small wireless tri-axial accelerometers worn simultaneously on different parts of the hand. Training data was collected from 5 subjects, who also performed the two IADLs without being given specific instructions on how to complete them. Feature vectors (mean, variance, correlation, spectral entropy and spectral energy) were calculated and tested on three classifiers (AdaBoost, HMM, k-NN). AdaBoost showed the best performance, with an overall accuracy of 86% for detecting each of these hand actions. The results show that although some actions are recognized well by a generalized classifier trained on subject-independent data, other actions require some amount of subject-specific training. We also observed an improvement in system performance when multiple accelerometers placed on the right hand were used.
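The feature set named in the abstract (mean, variance, correlation, spectral entropy and spectral energy over windows of tri-axial acceleration) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the window length, sampling rate, and the exact normalization of the spectral features are assumptions.

```python
import numpy as np

def window_features(window):
    """Features for one window of tri-axial accelerometer samples,
    shape (n_samples, 3): per-axis mean, variance, spectral energy,
    spectral entropy, plus pairwise axis correlations."""
    feats = []
    for axis in range(3):
        x = window[:, axis]
        feats.append(x.mean())
        feats.append(x.var())
        # Spectral energy: mean squared FFT magnitude (DC bin excluded)
        mag = np.abs(np.fft.rfft(x))[1:]
        feats.append(np.sum(mag ** 2) / len(mag))
        # Spectral entropy: Shannon entropy of the normalized power spectrum
        psd = mag ** 2
        p = psd / psd.sum()
        feats.append(-np.sum(p * np.log2(p + 1e-12)))
    # Pairwise correlation between the three axes
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        feats.append(np.corrcoef(window[:, i], window[:, j])[0, 1])
    return np.array(feats)

# Example: one second of synthetic data at an assumed 50 Hz rate
rng = np.random.default_rng(0)
w = rng.standard_normal((50, 3))
fv = window_features(w)
print(fv.shape)  # 3 axes x 4 per-axis features + 3 correlations
```

Each window thus yields a 15-dimensional vector (per accelerometer) that can be fed to any of the three classifiers compared in the paper.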