Although gaze is an attractive modality for pervasive interactions, the real-world implementation of eye-based interfaces poses significant challenges, such as calibration. We present Pursuits, an innovative interaction technique that enables truly spontaneous interaction with eye-based interfaces. A user can simply walk up to the screen and readily interact with moving targets. Instead of relying on gaze location, Pursuits correlates the user's smooth pursuit eye movements with objects moving dynamically on the interface. We evaluate the influence of target speed, number, and trajectory, and develop guidelines for designing Pursuits-based interfaces. We then describe six realistic usage scenarios and implement three of them to evaluate the method in a usability study and a field study. Our results show that Pursuits is a versatile and robust technique and that users can interact with Pursuits-based interfaces without prior knowledge or a preparation phase.
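The core idea of correlating pursuit movements with target motion can be sketched as follows. This is a minimal illustration, not the paper's implementation: it compares the gaze trajectory against each target's trajectory over a fixed sample window using the Pearson correlation coefficient per axis, and selects the target whose weaker axis still exceeds a threshold. The window length and the 0.8 threshold are illustrative assumptions.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0 or sb == 0:
        return 0.0  # a constant signal carries no pursuit information
    return cov / (sa * sb)

def select_pursued_target(gaze, targets, threshold=0.8):
    """Return the index of the target whose trajectory best correlates
    with the gaze trajectory, or None if no target's correlation exceeds
    the threshold. `gaze` and each entry of `targets` are lists of
    (x, y) samples covering the same time window."""
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best, best_score = None, threshold
    for i, traj in enumerate(targets):
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Correlate each axis separately and keep the minimum, so that
        # both horizontal and vertical gaze motion must match the target.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best, best_score = i, score
    return best
```

Because Pearson correlation is invariant to offset and scale, this works even when gaze is uncalibrated: following a circling target produces gaze samples that correlate with that target's trajectory regardless of where on the screen the raw gaze estimates land.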