Vibrotactile rendering of head gestures for controlling electric wheelchair

  • Authors:
  • Shafiq ur Réhman; Bisser Raytchev; Ikushi Yoda; Li Liu

  • Affiliations:
  • Digital Media Lab., Tillämpad Fysik och Elektronik, Umeå Universitet, Sweden (Réhman, Liu); National Institute of Advanced Industrial Science and Technology, Japan (Raytchev, Yoda)

  • Venue:
  • SMC '09: Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics
  • Year:
  • 2009

Abstract

We have developed a head-gesture-controlled electric wheelchair system to aid persons with severe disabilities. Real-time range information obtained from a stereo camera is used to locate and segment the user's face in the sensed video. We use an Isomap-based nonlinear manifold embedding of facial textures for head pose estimation. Our system is a non-contact vision system, which makes it much more convenient to use: the user only needs to gesture with his/her head to command the wheelchair. To avoid the appearance of a non-responding system, the user must be notified of the exact system state while the system is in use. In this paper, we explore the use of vibrotactile rendering of head gestures as such feedback. Three feedback systems are developed and tested: audio stimuli, vibrotactile stimuli, and combined audio plus vibrotactile stimuli. We have performed user tests to study the usability of these three display methods. The usability studies show that the combined audio plus vibrotactile feedback outperforms the other two methods (i.e., audio-only and vibrotactile-only feedback).
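
The pose estimation step described in the abstract can be prototyped with off-the-shelf manifold learning tools. The following is a minimal sketch, assuming scikit-learn's Isomap and a k-nearest-neighbor regressor on the learned embedding; the image size, thresholds, and function names are illustrative assumptions, not implementation details from the paper.

```python
# Sketch (assumed, not the authors' code): Isomap embedding of face crops
# plus a k-NN regressor from the embedding to yaw/pitch angles, and a
# simple mapping from pose to a discrete wheelchair command.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor

def train_pose_estimator(face_crops, poses, n_neighbors=8, n_components=2):
    """face_crops: (N, H, W) grayscale face images; poses: (N, 2) yaw/pitch in degrees."""
    X = face_crops.reshape(len(face_crops), -1).astype(np.float64)
    iso = Isomap(n_neighbors=n_neighbors, n_components=n_components)
    Z = iso.fit_transform(X)                      # nonlinear manifold embedding of facial textures
    knn = KNeighborsRegressor(n_neighbors=n_neighbors).fit(Z, poses)
    return iso, knn

def estimate_pose(iso, knn, face_crop):
    """Embed a new face crop with the learned Isomap model and predict yaw/pitch."""
    z = iso.transform(face_crop.reshape(1, -1).astype(np.float64))
    return knn.predict(z)[0]                      # e.g. array([yaw, pitch])

def pose_to_command(yaw, pitch, threshold=15.0):
    """Map the estimated head pose to a discrete command (threshold is illustrative)."""
    if pitch < -threshold:
        return "forward"                          # head tilted down
    if yaw > threshold:
        return "right"
    if yaw < -threshold:
        return "left"
    return "stop"
```

In such a setup, the command returned by `pose_to_command` would also drive the feedback channel (audio, vibrotactile, or both), so that the user is informed of the state the system has actually recognized.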