As wearable sensors and computing hardware become a reality, new and unorthodox approaches to seamless human-computer interaction can be explored. This paper presents the prototype of a wearable, head-mounted device for advanced human-machine interaction that integrates speech recognition and computer vision with head-gesture analysis based on inertial sensor data. We focus on the innovative idea of integrating visual and inertial data processing for interaction. Fusing head gestures with results from visual analysis of the environment yields rich vocabularies for human-machine communication because it turns the environment itself into an interface: once objects in the surroundings are associated with system activities, a head gesture can trigger the corresponding command whenever the user looks at that object. We explain the algorithmic approaches applied in our prototype and present experiments that highlight its potential for assistive technology. Besides pointing out a new direction for seamless interaction in general, our approach provides a new and easy-to-use interface for disabled and paralyzed users in particular.
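The fusion idea described above can be illustrated with a minimal sketch: a head nod is detected from an inertial pitch-rate signal, and a command fires only if the nod coincides with a recognized object that has an associated system activity. All function names, thresholds, and object-to-command bindings here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of visual/inertial fusion for gesture-triggered commands.
# Thresholds, object labels, and command names are made up for illustration.

def detect_nod(pitch_rates, threshold=0.5):
    """Return True if the pitch-rate signal (rad/s) shows a down-then-up swing,
    a crude signature of a head nod."""
    crossed_down = crossed_up = False
    for rate in pitch_rates:
        if rate < -threshold:
            crossed_down = True
        elif rate > threshold and crossed_down:
            crossed_up = True
    return crossed_down and crossed_up

# Assumed binding of recognized objects to system activities.
COMMANDS = {"lamp": "toggle_light", "radio": "toggle_audio"}

def fuse(pitch_rates, fixated_object):
    """Trigger the command bound to the object the user is looking at,
    but only when the inertial data contains a nod gesture."""
    if detect_nod(pitch_rates) and fixated_object in COMMANDS:
        return COMMANDS[fixated_object]
    return None

# A nod while looking at the lamp triggers its command; without a nod, nothing fires.
nod = [0.0, -0.9, -1.2, 0.2, 1.1, 0.8, 0.0]
print(fuse(nod, "lamp"))          # -> toggle_light
print(fuse([0.0] * 7, "lamp"))    # -> None
```

The key design point is that neither channel alone suffices: the visual analysis supplies the context (which object is fixated), while the inertial channel supplies the confirmation gesture, so accidental glances do not trigger actions.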