Humans can consciously control their eye movements to the extent of performing sequences of predefined movement patterns, or 'gaze gestures'. Gaze gestures can be tracked noninvasively with a video-based eye-tracking system, and they hold the potential to become an emerging input paradigm for human-computer interaction (HCI) as low-cost eye trackers become more ubiquitous. The viability of gaze gestures as an innovative way to control a computer rests on how easily potential users can assimilate them, and on the ability of machine learning algorithms to discriminate, in real time, intentional gaze gestures from the typical gaze activity performed during standard interaction with electronic devices. In this work, through a set of experiments and user studies, we evaluate the performance of two gaze gesture modalities, gliding gaze gestures and saccadic gaze gestures, together with their corresponding real-time recognition algorithms: Hierarchical Temporal Memory networks and the Needleman-Wunsch algorithm for sequence alignment. Our results show that one specific combination of gesture modality and recognition algorithm, namely saccadic gaze gestures recognized with Needleman-Wunsch, allows reliable use of intentional gaze gestures to interact with a computer, with accuracy rates above 95% and completion times of roughly 1.5 to 2.5 seconds per gesture. This optimal combination does not interfere with otherwise standard gaze-based human-computer interaction, generating very few false positives during real-time recognition and eliciting positive feedback from users. These encouraging results, together with the low-cost eye-tracking equipment used, open up a new HCI paradigm for the fields of accessibility and interaction with smartphones, tablets, projected displays, and traditional desktop computers.
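To illustrate the kind of matching the abstract refers to, the sketch below applies the standard Needleman-Wunsch global-alignment recurrence to a sequence of saccades quantized into compass directions. The token alphabet, scoring values, and example gesture are illustrative assumptions for exposition, not the paper's actual parameters.

```python
# Hedged sketch: Needleman-Wunsch global alignment used to score how closely
# an observed saccade-direction string matches a gesture template.
# Scoring scheme (match/mismatch/gap) is an assumed, illustrative choice.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Return the optimal global alignment score between sequences a and b."""
    n, m = len(a), len(b)
    # score[i][j] = best alignment score of a[:i] against b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = score[i - 1][0] + gap        # a[:i] aligned to gaps
    for j in range(1, m + 1):
        score[0][j] = score[0][j - 1] + gap        # b[:j] aligned to gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag,                # align a[i-1] with b[j-1]
                              score[i - 1][j] + gap,   # gap in b
                              score[i][j - 1] + gap)   # gap in a
    return score[n][m]

# Example: saccades quantized to four compass directions (R, L, U, D).
template = "RDLU"    # hypothetical square-shaped gesture template
observed = "RDDLU"   # observed stream containing one spurious saccade
print(needleman_wunsch(template, observed))  # prints 3 (4 matches, 1 gap penalty)
```

A recognizer along these lines would align the live saccade stream against each gesture template and fire when a score clears a threshold; the threshold itself would have to be tuned so that ordinary reading and browsing saccades rarely reach it, which is the false-positive concern the abstract raises.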