The increasing availability and accuracy of eye gaze detection equipment have encouraged its use for both investigation and control. In this paper we present novel methods for navigating and inspecting extremely large images solely or primarily using eye gaze control. We investigate the relative advantages and comparative properties of four related methods: Stare-to-Zoom (STZ), in which control of the image position and resolution level is determined solely by the user's gaze position on the screen; Head-to-Zoom (HTZ) and Dual-to-Zoom (DTZ), in which gaze control is augmented by head or mouse actions; and Mouse-to-Zoom (MTZ), which uses conventional mouse input as an experimental control. The need to inspect large images arises in many disciplines, such as mapping, medicine, astronomy and surveillance. Here we consider the inspection of very large aerial images, of which Google Earth is both an example and the platform employed in our study. We perform comparative search and navigation tasks with each of the methods described, and record user opinions using the Swedish User-Viewer Presence Questionnaire. We conclude that, while gaze methods are effective for image navigation, they still lag behind more conventional methods, and interaction designers may well consider combining these techniques for greatest effect.
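The core idea behind Stare-to-Zoom can be illustrated with a minimal dwell-detection rule: when successive gaze samples cluster tightly for long enough, the viewer zooms in centred on the fixation; otherwise the view only pans toward the gaze point. The sketch below is a hypothetical illustration under assumed parameters (dwell time, dispersion radius, zoom step), not the paper's actual implementation.

```python
import math

def stare_to_zoom(gaze_samples, dwell_time=0.5, radius=30.0,
                  sample_dt=0.05, zoom_step=1.2):
    """Illustrative Stare-to-Zoom rule (parameters are assumptions).

    gaze_samples: list of (x, y) screen positions, newest last,
    sampled every `sample_dt` seconds. If the most recent samples
    covering `dwell_time` seconds all lie within `radius` pixels of
    their centroid, treat it as a fixation and zoom in there;
    otherwise pan toward the gaze point without zooming.
    Returns (cx, cy, zoom_factor) or None if too few samples.
    """
    needed = max(1, int(dwell_time / sample_dt))
    if len(gaze_samples) < needed:
        return None  # not enough data to decide yet
    recent = gaze_samples[-needed:]
    cx = sum(x for x, _ in recent) / needed
    cy = sum(y for _, y in recent) / needed
    if all(math.hypot(x - cx, y - cy) <= radius for x, y in recent):
        return (cx, cy, zoom_step)  # fixation detected: zoom in
    return (cx, cy, 1.0)            # gaze moving: pan only
```

In a real viewer this decision would run every frame, with the returned centre and factor fed to the map renderer; the augmented variants (HTZ, DTZ) would replace the dwell test with an explicit head or mouse action.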