A novel method for video-based head gesture recognition using eye information from an eye tracker is proposed. The method combines gaze and eye-movement data to infer head gestures. A major advantage over other gesture-based methods is that the user keeps their gaze on the interaction object while interacting. The method has been implemented on a head-mounted eye tracker to detect a set of predefined head gestures. The accuracy of the gesture classifier was evaluated and verified for gaze-based interaction in applications intended for both large public displays and small mobile-phone screens. A user study shows that the method detects the defined set of gestures reliably.
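The core idea can be illustrated with a small sketch. When the user keeps gaze fixed on a target, the vestibulo-ocular reflex makes the eye counter-rotate in the head, so a head shake appears as horizontal eye-in-head displacement and a nod as vertical displacement. The following is a minimal hypothetical classifier along those lines (the function name, threshold, and coordinate convention are illustrative assumptions, not the paper's actual implementation):

```python
# Hypothetical sketch: infer head gestures (nod vs. shake) from eye-in-head
# positions recorded by a head-mounted eye tracker, assuming the user keeps
# gaze fixed on a target so the eye counter-rotates during the head gesture.

def classify_head_gesture(eye_positions, threshold=0.15):
    """eye_positions: sequence of (x, y) pupil positions in normalized
    eye-camera coordinates. Returns 'shake', 'nod', or None."""
    xs = [p[0] for p in eye_positions]
    ys = [p[1] for p in eye_positions]
    x_range = max(xs) - min(xs)
    y_range = max(ys) - min(ys)
    # Eyes roughly still in the head: no head gesture while fixating.
    if x_range < threshold and y_range < threshold:
        return None
    # Dominantly horizontal displacement -> head shake; vertical -> nod.
    return "shake" if x_range >= y_range else "nod"

# Example: the eye oscillates horizontally in the head while gaze stays
# on the target, which this sketch classifies as a head shake.
trace = [(0.5 + 0.2 * (((i // 5) % 2) * 2 - 1), 0.5) for i in range(20)]
print(classify_head_gesture(trace))  # -> shake
```

A real system would of course work on filtered gaze and pupil signals and distinguish gestures from ordinary saccades, but the sketch shows why the interaction object can stay under the user's gaze throughout the gesture.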