Enabling always-available input with muscle-computer interfaces

  • Authors and affiliations:
  • T. Scott Saponas (University of Washington, Seattle, WA, USA)
  • Desney S. Tan (Microsoft Research, Redmond, WA, USA)
  • Dan Morris (Microsoft Research, Redmond, WA, USA)
  • Ravin Balakrishnan (University of Toronto, Toronto, ON, Canada)
  • Jim Turner (Microsoft Corporation, Redmond, WA, USA)
  • James A. Landay (University of Washington, Seattle, WA, USA)

  • Venue:
  • Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST '09)
  • Year:
  • 2009

Abstract

Previous work has demonstrated the viability of applying offline analysis to interpret forearm electromyography (EMG) and classify finger gestures on a physical surface. We extend those results to bring us closer to using muscle-computer interfaces for always-available input in real-world applications. We leverage existing taxonomies of natural human grips to develop a gesture set covering interaction in free space, even when the hands are busy with other objects. We present a system that classifies these gestures in real time, and we introduce a bi-manual paradigm that enables use in interactive systems. We report experimental results demonstrating four-finger classification accuracies averaging 79% for pinching, 85% while holding a travel mug, and 88% when carrying a weighted bag. We further show generalizability across different arm postures and explore the tradeoffs of providing real-time visual feedback.