This paper presents a novel approach to multimodal interaction that combines the user's mental activity (thoughts and emotions), facial expressions, and head movements. To avoid problems associated with computer vision (sensitivity to lighting changes, reliance on camera position, etc.), the proposed approach does not rely on optical techniques. Furthermore, to keep human communication and control smooth and to avoid other environmental artifacts, only non-verbal information is used. Head movements (rotations) are detected by a biaxial gyroscope; facial expressions and gaze are identified by electromyography and electrooculography; emotions and thoughts are monitored by electroencephalography. To validate the proposed approach, we developed an application in which the user navigates a virtual world using head movements, with Google Street View serving as the virtual world. The application was conceived for future integration with an electric wheelchair, so that the virtual world can be replaced by the real one. A first evaluation of the system is provided.
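As an illustration of how gyroscope and EMG signals could be mapped to discrete navigation commands in a Street View-like virtual world, the following is a minimal sketch in Python under stated assumptions; the thresholds, the jaw-clench "confirm" gesture, and all names (`SensorFrame`, `classify`, `Command`) are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch: turning biaxial gyroscope readings (yaw, pitch) and a
# binary EMG event into navigation commands for a Street View-like world.
# Thresholds and gesture choices are illustrative, not the authors' values.

from dataclasses import dataclass
from enum import Enum, auto


class Command(Enum):
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    MOVE_FORWARD = auto()
    NONE = auto()


@dataclass
class SensorFrame:
    yaw_rate: float    # deg/s from the biaxial gyroscope (head rotation left/right)
    pitch_rate: float  # deg/s (head nod up/down)
    emg_clench: bool   # EMG-detected jaw clench, used here as a "confirm" gesture


def classify(frame: SensorFrame,
             yaw_threshold: float = 30.0,
             pitch_threshold: float = 25.0) -> Command:
    """Map a single sensor frame to a navigation command.

    A head rotation beyond the yaw threshold pans the view; a downward nod
    combined with the EMG confirm gesture moves forward along the street.
    """
    if frame.yaw_rate <= -yaw_threshold:
        return Command.TURN_LEFT
    if frame.yaw_rate >= yaw_threshold:
        return Command.TURN_RIGHT
    if frame.pitch_rate <= -pitch_threshold and frame.emg_clench:
        return Command.MOVE_FORWARD
    return Command.NONE


if __name__ == "__main__":
    # Simulated frames standing in for the real sensor stream.
    frames = [
        SensorFrame(yaw_rate=45.0, pitch_rate=0.0, emg_clench=False),
        SensorFrame(yaw_rate=-50.0, pitch_rate=0.0, emg_clench=False),
        SensorFrame(yaw_rate=0.0, pitch_rate=-40.0, emg_clench=True),
    ]
    for f in frames:
        print(classify(f))
```

In a real system the classifier would run on a continuous, debounced sensor stream rather than on isolated frames, but the sketch conveys the basic idea of translating non-verbal signals into virtual-world navigation.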