Shared user-computer control of a robotic wheelchair system
We have developed a head-gesture-controlled electric wheelchair system to aid persons with severe disabilities. Real-time range information from a stereo camera is used to locate and segment the user's face in the sensed video, and an Isomap-based nonlinear manifold learning map of facial textures is used for head pose estimation. Because the system is a non-contact vision system, it is much more convenient to use: the user only needs to gesture with his/her head to command the wheelchair. However, to overcome problems with a non-responding system, the user must be notified of the exact system state while the system is in use. In this paper, we explore the use of vibrotactile rendering of head gestures as such feedback. Three feedback systems are developed and tested: audio stimuli, vibrotactile stimuli, and combined audio plus vibrotactile stimuli. We have performed user tests to study the usability of these three display methods; the studies show that the combined audio plus vibrotactile feedback outperforms the audio-only and vibrotactile-only methods.
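To illustrate the pose-estimation step, the following is a minimal sketch of an Isomap-based manifold learning pipeline, assuming scikit-learn is available. The synthetic "face texture" vectors, the single yaw parameter, and the nearest-neighbor regressor used to map manifold coordinates back to pose angles are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for cropped, normalized face-texture vectors:
# a single latent pose angle (yaw, in degrees) generates each
# high-dimensional sample lying on a smooth 1-D manifold.
yaw = rng.uniform(-60, 60, size=300)
faces = np.stack([np.cos(np.linspace(0, 4, 64) + np.deg2rad(a)) for a in yaw])
faces += rng.normal(scale=0.01, size=faces.shape)  # sensor noise

# Learn a nonlinear 1-D embedding of the face textures with Isomap.
embedding = Isomap(n_neighbors=10, n_components=1).fit_transform(faces)

# Map manifold coordinates to pose angles with a nearest-neighbor
# regressor (one simple way to read a pose estimate off the manifold).
pose_model = KNeighborsRegressor(n_neighbors=5).fit(embedding, yaw)
pred = pose_model.predict(embedding)
mae = float(np.mean(np.abs(pred - yaw)))
print(mae)  # mean absolute pose error in degrees
```

In a real system the synthetic vectors would be replaced by the stereo-segmented face images, and the recovered pose would be thresholded into discrete head-gesture commands for the wheelchair controller.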