Two gesture recognition systems for immersive math education of the deaf

  • Authors:
  • Nicoletta Adamo-Villani;Justin Heisler;Laura Arns

  • Affiliations:
  • Purdue University, West Lafayette, IN;Vicarious Visions, Menands, New York;Purdue University, West Lafayette, IN

  • Venue:
  • Proceedings of the First International Conference on Immersive Telecommunications
  • Year:
  • 2007


Abstract

The general goal of our research is the creation of a natural and intuitive interface for navigation, interaction, and input/recognition of American Sign Language (ASL) math signs in immersive Virtual Environments (VE) for the Deaf. The specific objective of this work is the development of two new gesture recognition systems for SMILE™, an immersive learning game that employs a fantasy 3D virtual environment to engage deaf children in math-based educational tasks. Presently, SMILE includes standard VR interaction devices such as a 6DOF wand, a pair of pinch gloves, and a dance platform. In this paper we propose a significant improvement to the application in the form of two new gesture control mechanisms: system (1) is based entirely on hand gestures and uses a pair of 18-sensor data gloves; system (2) is based on hand and body gestures and uses a pair of data gloves together with a motion tracking system. Both interfaces support first-person motion control, object selection and manipulation, and real-time input/recognition of ASL numbers zero to twenty. Although the systems described in the paper rely on high-end, expensive hardware, they can be considered a first step toward the realization of an effective immersive sign language interface.
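
The abstract does not specify the recognition algorithm used for the ASL number signs. The sketch below is purely illustrative and not the authors' implementation: it assumes a simple nearest-neighbor template match over the 18 flex-sensor values of one data glove, and all function names, template data, and thresholds are hypothetical.

```python
# Illustrative sketch only: assumes nearest-neighbor matching of one glove's
# 18 sensor values against per-signer calibration templates for ASL numbers 0-20.
import numpy as np

NUM_SENSORS = 18  # sensors per glove, as stated in the abstract


def normalize(reading: np.ndarray) -> np.ndarray:
    """Scale raw sensor values to [0, 1] to reduce per-user calibration drift."""
    lo, hi = reading.min(), reading.max()
    return (reading - lo) / (hi - lo + 1e-9)


def recognize_sign(reading: np.ndarray, templates: dict) -> int:
    """Return the ASL number whose stored template is closest (Euclidean
    distance) to the current normalized glove reading."""
    sample = normalize(reading)
    return min(templates, key=lambda n: np.linalg.norm(sample - normalize(templates[n])))


# Usage: templates would be captured per signer during a calibration pass.
templates = {n: np.random.rand(NUM_SENSORS) for n in range(21)}  # placeholder data
live_reading = np.random.rand(NUM_SENSORS)                       # placeholder frame
print(recognize_sign(live_reading, templates))
```

In practice, a system like the paper's would also need to handle temporal segmentation of continuous signing and two-handed signs; the snippet above covers only the static, single-frame, single-glove case.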