We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which proved accurate enough in recognizing nodding and the left-directed gestures. Gaze-estimation accuracy did not noticeably suffer from the quick head motions. Participants rated nodding as the best gesture for occasional selection tasks and considered the other gestures promising for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repetitive tasks such as eye typing. However, for multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could provide a more natural and more accurate interaction method.
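The range-based detection idea can be sketched roughly as follows. This is a hypothetical illustration only, not the authors' algorithm: the function name, window representation, and thresholds are all assumptions, and it covers just nod vs. turn classification over a short window of eye-position samples (a quick head motion shifts the tracked eye position, so a large excursion along one axis suggests a gesture along that axis).

```python
# Hypothetical range-based gesture classifier over a window of eye-position
# samples; the 40-unit thresholds are illustrative, not from the paper.
def detect_gesture(xs, ys, x_range=40.0, y_range=40.0):
    """Classify a window of (x, y) eye positions as 'nod', 'turn', or None.

    A nod appears as a large vertical excursion of the tracked eye
    position; a head turn appears as a large horizontal excursion.
    """
    dx = max(xs) - min(xs)  # horizontal range within the window
    dy = max(ys) - min(ys)  # vertical range within the window
    if dy > y_range and dy > dx:
        return "nod"
    if dx > x_range and dx > dy:
        return "turn"
    return None

# Example: a window dominated by vertical motion classifies as a nod.
print(detect_gesture([0, 2, 1, 3], [0, 25, 50, 5]))  # prints "nod"
```

In practice such a detector would run over a sliding time window and would also need to separate deliberate gestures from ordinary saccades, e.g. by requiring the excursion to complete within a short duration.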