A multi-touchpad based human computer gesture interface for people with disabilities

  • Authors:
  • Yu Yuan; Kenneth Barner

  • Affiliations:
  • University of Delaware, Newark, DE; University of Delaware, Newark, DE

  • Venue:
  • Telehealth/AT '08 Proceedings of the IASTED International Conference on Telehealth/Assistive Technologies
  • Year:
  • 2008

Abstract

This paper describes a multi-touchpad based human-computer interaction (HCI) system called MGesture, which uses tactile gestures as input to perform common keyboard and mouse controls. The MGesture interface extends the multi-touchpad with an advanced tactile gesture recognition engine and a fully customizable command processor. Users with physical disabilities can define their own gesture sets by providing training samples and associating them with corresponding computer commands. The application is equipped with two gesture recognition algorithms: recurrent neural network (RNN) based and support vector machine (SVM) based recognition. Preliminary experiments show the effectiveness of the gesture recognition interface and its usage within the multi-touchpad framework for users with disabilities.
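The abstract describes a user-trainable pipeline: users supply sample strokes, label each with a keyboard or mouse command, and the system later classifies live strokes and dispatches the associated command. The paper's own recognizers are RNN- and SVM-based; the sketch below is only an illustration of that train-then-recognize flow, substituting a much simpler nearest-template classifier over resampled, normalized stroke features. All class and function names here are hypothetical, not from the paper.

```python
import math

def resample(points, n=16):
    """Resample a 2D stroke to n points evenly spaced along its arc length."""
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out = []
    for i in range(n):
        t = total * i / (n - 1)
        j = 1
        while j < len(dists) - 1 and dists[j] < t:
            j += 1
        seg = (dists[j] - dists[j - 1]) or 1.0
        a = (t - dists[j - 1]) / seg
        (x0, y0), (x1, y1) = points[j - 1], points[j]
        out.append((x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
    return out

def normalize(points):
    """Translate the stroke to its centroid and scale to a unit bounding box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def features(stroke):
    """Flatten a resampled, normalized stroke into a fixed-length vector."""
    return [v for p in normalize(resample(stroke)) for v in p]

class GestureCommandMap:
    """User-defined gesture set: training strokes labelled with commands."""
    def __init__(self):
        self.templates = {}  # command -> list of feature vectors

    def train(self, command, stroke):
        self.templates.setdefault(command, []).append(features(stroke))

    def recognize(self, stroke):
        """Return the command whose template is nearest in feature space."""
        f = features(stroke)
        best, best_d = None, float("inf")
        for cmd, feats in self.templates.items():
            for t in feats:
                d = sum((a - b) ** 2 for a, b in zip(f, t))
                if d < best_d:
                    best, best_d = cmd, d
        return best

# Train two gestures (hypothetical command names), then classify a new stroke.
gm = GestureCommandMap()
gm.train("page_down", [(0, 0), (5, 0), (10, 0)])   # horizontal swipe
gm.train("page_up", [(0, 10), (0, 5), (0, 0)])     # vertical swipe
print(gm.recognize([(1, 1), (6, 1), (11, 1)]))     # → page_down
```

In a real SVM-based variant, the same fixed-length feature vectors would simply be fed to a trained classifier instead of the nearest-template search.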