Optically sensing tongue gestures for computer input

  • Authors:
  • T. Scott Saponas; Daniel Kelly; Babak A. Parviz; Desney S. Tan

  • Affiliations:
  • University of Washington, Seattle, WA, USA; University of Washington, Seattle, WA, USA; University of Washington, Seattle, WA, USA; Microsoft Research, Redmond, WA, USA

  • Venue:
  • Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST)
  • Year:
  • 2009

Abstract

Many patients with paralyzing injuries or medical conditions retain the use of their cranial nerves, which control the eyes, jaw, and tongue. While researchers have explored eye-tracking and speech technologies for these patients, we believe there is potential for directly sensing explicit tongue movement for controlling computers. In this paper, we describe a novel approach that uses infrared optical sensors embedded within a dental retainer to sense tongue gestures. We describe an experiment showing that our system effectively discriminates among four simple gestures with over 90% accuracy. In this experiment, users were also able to play the popular game Tetris with their tongues. Finally, we present lessons learned and opportunities for future work.
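
The abstract does not specify how the four gestures are discriminated from the infrared sensor readings. As an illustration only, the sketch below shows one plausible way such a discrimination step could look: windows of multi-sensor IR readings are reduced to per-sensor mean features and classified with a nearest-centroid rule. The sensor count, gesture labels, and classifier choice here are assumptions for illustration, not the paper's actual pipeline.

```python
# Illustrative sketch, not the authors' method: nearest-centroid
# classification of short windows of infrared proximity readings.
from statistics import mean
from typing import Dict, List, Sequence

Window = Sequence[Sequence[float]]  # one reading per time step, one value per IR sensor


def extract_features(window: Window) -> List[float]:
    """Collapse a window of multi-sensor IR readings into per-sensor means."""
    n_sensors = len(window[0])
    return [mean(step[i] for step in window) for i in range(n_sensors)]


def train_centroids(examples: Dict[str, List[Window]]) -> Dict[str, List[float]]:
    """Average the feature vectors of each gesture's training windows."""
    centroids = {}
    for label, windows in examples.items():
        feats = [extract_features(w) for w in windows]
        dim = len(feats[0])
        centroids[label] = [mean(f[i] for f in feats) for i in range(dim)]
    return centroids


def classify(window: Window, centroids: Dict[str, List[float]]) -> str:
    """Assign the gesture whose centroid is nearest in squared Euclidean distance."""
    feats = extract_features(window)

    def dist(c: List[float]) -> float:
        return sum((a - b) ** 2 for a, b in zip(feats, c))

    return min(centroids, key=lambda label: dist(centroids[label]))


if __name__ == "__main__":
    # Hypothetical training data: four made-up gestures, two IR sensors.
    training = {
        "swipe_left":  [[(0.9, 0.1), (0.8, 0.2)], [(0.85, 0.15), (0.9, 0.1)]],
        "swipe_right": [[(0.1, 0.9), (0.2, 0.8)], [(0.15, 0.85), (0.1, 0.9)]],
        "tap_front":   [[(0.9, 0.9), (0.8, 0.8)], [(0.85, 0.9), (0.9, 0.85)]],
        "rest":        [[(0.1, 0.1), (0.2, 0.1)], [(0.1, 0.15), (0.15, 0.1)]],
    }
    model = train_centroids(training)
    print(classify([(0.88, 0.12), (0.9, 0.1)], model))  # expected: swipe_left
```

In a real system of this kind, the gesture set, sensor geometry, and classifier would need to be tuned to the retainer hardware; the sketch only conveys the general idea of mapping short sensor windows to a small set of discrete gestures.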