Eye gaze is a compelling interaction modality but requires user calibration before interaction can commence. State-of-the-art procedures require the user to fixate on a succession of calibration markers, a task often experienced as difficult and tedious. We present pursuit calibration, a novel approach that, unlike existing methods, is able to detect the user's attention to a calibration target. It achieves this by using moving targets and correlating eye movement with the target trajectory, implicitly exploiting smooth pursuit eye movements. Calibration data is then sampled only while the user is attending to the target. Because of its ability to detect user attention, pursuit calibration can be performed implicitly, which enables more flexible designs of the calibration task. We demonstrate this in application examples and user studies, and show that pursuit calibration is tolerant to interruption, can blend naturally with applications, and is able to calibrate users without their awareness.
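The core idea of detecting attention via smooth pursuit can be sketched as follows: over a sliding window, correlate the raw gaze trace with the moving target's trajectory, and keep gaze/target pairs for calibration only when both coordinate axes correlate strongly. This is a minimal illustrative sketch, not the authors' implementation; the function names, window length, and correlation threshold are assumptions chosen for clarity.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:  # a constant trace (e.g. a fixation) cannot correlate
        return 0.0
    return cov / (sx * sy)

def attended_samples(gaze, target, window=30, threshold=0.9):
    """Collect (gaze, target) point pairs from windows in which both the
    horizontal and vertical gaze traces correlate strongly with the target
    trajectory, i.e. where the eye is plausibly pursuing the moving target.
    `window` and `threshold` are illustrative values, not from the paper."""
    samples = []
    for i in range(0, len(gaze) - window + 1, window):
        g = gaze[i:i + window]
        t = target[i:i + window]
        rx = pearson([p[0] for p in g], [p[0] for p in t])
        ry = pearson([p[1] for p in g], [p[1] for p in t])
        if rx > threshold and ry > threshold:
            samples.extend(zip(g, t))
    return samples
```

The retained pairs would then feed whatever mapping the tracker uses for calibration (e.g. a regression from raw gaze coordinates to screen coordinates); windows where the user looks away or is interrupted simply contribute no samples, which is what makes the procedure tolerant to interruption.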