Controlling computers with eye movements can provide a fast and efficient alternative to the mouse, but implementing object selection in gaze-controlled systems remains a challenge. Dwell times, i.e., fixations on an object held long enough to trigger its selection, have several disadvantages. We studied deviations of critical thresholds using an individual, task-specific adaptation method, which revealed enormous variability in optimal dwell times. We also developed an alternative approach that uses antisaccades for selection: when an object is highlighted, a copy of it appears to one side, and the object is selected by fixating the side opposite that copy, which requires inhibiting the automatic gaze shift toward the newly appearing stimulus. We compared both techniques in a selection task. Two experiments showed superior error performance for the individually adapted dwell times; antisaccades offer an alternative to dwell-time selection, but they did not improve on it. We discuss potential improvements to the antisaccade implementation with which antisaccades might become a serious alternative to dwell times for object selection in gaze-controlled systems.
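To make the dwell-time mechanism discussed above concrete, here is a minimal illustrative sketch (not the authors' implementation): a selector that accumulates continuous gaze time on a target and fires a selection once a per-user threshold is exceeded. The class name, method names, and the 500 ms default are hypothetical; the per-user threshold parameter reflects the paper's finding that optimal dwell times vary strongly across individuals.

```python
class DwellTimeSelector:
    """Illustrative dwell-time selection: fire when continuous gaze on one
    target exceeds a threshold. Names and defaults are hypothetical."""

    def __init__(self, dwell_threshold_ms=500.0):
        # Per-user threshold; the paper reports large individual variability
        # in optimal dwell times, so this would be adapted per user and task.
        self.dwell_threshold_ms = dwell_threshold_ms
        self.current_target = None
        self.dwell_ms = 0.0

    def update(self, gazed_target, dt_ms):
        """Feed one gaze sample (target under gaze, elapsed ms since the
        previous sample); return the selected target, or None."""
        if gazed_target != self.current_target:
            # Gaze moved to a different object: restart dwell accumulation.
            self.current_target = gazed_target
            self.dwell_ms = 0.0
            return None
        if gazed_target is None:
            return None
        self.dwell_ms += dt_ms
        if self.dwell_ms >= self.dwell_threshold_ms:
            self.dwell_ms = 0.0  # require a fresh dwell before reselecting
            return gazed_target
        return None
```

For example, with a 500 ms threshold, two consecutive 300 ms samples on the same button accumulate 600 ms of dwell and trigger the selection on the second sample; any glance away resets the timer, which is exactly the behavior whose threshold the adaptation method tunes per user.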