Gaze gesture recognition with hierarchical temporal memory networks
IWANN'11 Proceedings of the 11th International Work-Conference on Artificial Neural Networks: Advances in Computational Intelligence - Volume Part I
Predefined sequences of eye movements, or 'gaze gestures', can be consciously performed by humans and monitored non-invasively using remote video oculography. Gaze gestures hold great potential for human-computer interaction (HCI), provided that they can be easily assimilated by potential users, monitored with low-cost gaze-tracking equipment, and reliably distinguished by machine learning algorithms from the typical gaze activity performed during standard HCI. In this work, the real-time gaze gesture recognition performance of a bio-inspired Bayesian pattern recognition algorithm, Hierarchical Temporal Memory (HTM), is evaluated through a user study. To improve the performance of traditional HTM during real-time recognition, an extension of the algorithm is proposed that adapts HTM to the temporal structure of gaze gestures. The extension adds a top node to the HTM topology that stores and compares sequences of input data by sequence alignment using dynamic programming; encoding a gesture as a spatio-temporal sequence handles the temporal evolution of gaze gesture instances. The extended HTM reliably discriminates intentional gaze gestures from otherwise standard human-machine gaze interaction, reaching up to 98% recognition accuracy on a data set of 10 categories of gaze gestures, with acceptable completion speeds and a low rate of false positives during standard gaze-computer interaction. These positive results, achieved despite the low-cost hardware employed, support the notion of gaze gestures as a new HCI paradigm for accessibility and for interaction with smartphones, tablets, projected displays and traditional desktop computers.
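The top-node matching step described above can be illustrated with a minimal sketch: stored gesture templates are compared against an incoming sequence of quantized gaze positions using dynamic-programming sequence alignment (here, plain Levenshtein edit distance, with a similarity threshold to reject unintentional gaze activity). All names, the symbol encoding, and the threshold value are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of top-node gesture matching via dynamic programming.
# Gaze gestures are assumed to be encoded as strings of screen-region
# symbols; the encoding and threshold are illustrative, not from the paper.

def edit_distance(a, b):
    """Levenshtein distance between two symbol sequences (row-wise DP)."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))          # distances for the previous row
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,          # deletion
                          curr[j - 1] + 1,      # insertion
                          prev[j - 1] + cost)   # substitution / match
        prev = curr
    return prev[n]

def classify_gesture(observed, templates, threshold=0.8):
    """Return the best-matching gesture label, or None when no template
    is similar enough (rejecting ordinary, unintentional gaze activity)."""
    best_label, best_score = None, 0.0
    for label, template in templates.items():
        dist = edit_distance(observed, template)
        score = 1.0 - dist / max(len(observed), len(template))
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Example: two hypothetical gesture templates over regions A, B, C
templates = {"L-shape": "AABBCC", "zigzag": "ABABAB"}
print(classify_gesture("AABBC", templates))   # -> "L-shape" (1 edit away)
print(classify_gesture("CCCAAA", templates))  # -> None (rejected as noise)
```

The normalized score makes the decision length-independent, so short noisy fixation runs do not accidentally outscore full gesture instances; a real implementation would operate on the HTM node's quantized outputs rather than raw characters.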