The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS) - Special issue on computer-human interaction
An evaluation of an eye tracker as a device for computer input
CHI '87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface
Acquisition of expanding targets
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Application of Fitts' law to eye gaze interaction
CHI '00 Extended Abstracts on Human Factors in Computing Systems
Human on-line response to target expansion
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Gaze-based selection of standard-size menu items
ICMI '05 Proceedings of the 7th international conference on Multimodal interfaces
Efficient eye pointing with a fisheye lens
GI '05 Proceedings of Graphics Interface 2005
Improving hands-free menu selection using eyegaze glances and fixations
Proceedings of the 2008 symposium on Eye tracking research & applications
Noise tolerant selection by gaze-controlled pan and zoom in 3D
Proceedings of the 2008 symposium on Eye tracking research & applications
Improving eye cursor's stability for eye pointing tasks
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The inspection of very large images by eye-gaze control
AVI '08 Proceedings of the working conference on Advanced visual interfaces
GazeSpace: eye gaze controlled content spaces
BCS-HCI '07 Proceedings of the 21st British HCI Group Annual Conference on People and Computers: HCI...but not as we know it - Volume 2
Considerations for Using Eye Trackers during Usability Studies
IDGD '09 Proceedings of the 3rd International Conference on Internationalization, Design and Global Development: Held as Part of HCI International 2009
The MAGIC Touch: Combining MAGIC-Pointing with a Touch-Sensitive Mouse
INTERACT '09 Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction: Part II
Instantaneous saccade driven eye gaze interaction
Proceedings of the International Conference on Advances in Computer Entertainment Technology
Modeling dwell-based eye pointing target acquisition
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Designing gaze-supported multimodal interactions for the exploration of large image collections
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Comparison of gaze-to-objects mapping algorithms
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Look & touch: gaze-supported target acquisition
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Comparison of video-based pointing and selection techniques for hands-free text entry
Proceedings of the International Working Conference on Advanced Visual Interfaces
Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing
Recent evidence on the performance benefits of expanding targets during manual pointing raises a provocative question: Can a similar effect be expected for eye gaze interaction? We present two experiments examining the benefits of target expansion during an eye-controlled selection task. The second experiment also tested the efficiency of a "grab-and-hold algorithm" in counteracting inherent eye jitter. Results confirm the benefits of target expansion in both pointing speed and accuracy. Additionally, the grab-and-hold algorithm yields a 57% reduction in error rates overall, and as much as 68% for targets subtending 0.35 degrees of visual angle. This comes at a cost, however: a slight (10%) increase in movement time. These findings indicate that target expansion, coupled with additional measures to accommodate eye jitter, has the potential to make eye gaze a more suitable input modality.
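The abstract does not specify how the grab-and-hold algorithm works internally. As a rough illustration of the general idea — "grabbing" the cursor at a gaze landing point and holding it there so small jitter does not move it — here is a minimal sketch; the class name, `release_radius` threshold, and update logic are all assumptions for illustration, not the authors' algorithm:

```python
# Hypothetical sketch of a grab-and-hold-style gaze stabilizer.
# All names and thresholds are illustrative, not from the paper.
from dataclasses import dataclass
import math


@dataclass
class GrabAndHold:
    """Hold the cursor at a grabbed gaze point until gaze drifts
    beyond a release radius, suppressing small fixation jitter."""
    release_radius: float = 30.0  # pixels; illustrative threshold
    _held: tuple = None           # currently grabbed (x, y), if any

    def update(self, x: float, y: float) -> tuple:
        """Feed one raw gaze sample; return the stabilized cursor."""
        if self._held is None:
            self._held = (x, y)       # grab the first sample
            return self._held
        hx, hy = self._held
        if math.hypot(x - hx, y - hy) <= self.release_radius:
            return self._held         # hold: jitter is suppressed
        self._held = (x, y)           # genuine movement: re-grab
        return self._held
```

In this sketch, samples jittering within 30 pixels of the grabbed point leave the cursor pinned, while a saccade-sized jump re-grabs at the new location — which is consistent with the abstract's reported trade-off of fewer errors at the cost of slightly longer movement times.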