Since eye gaze can serve as an efficient and natural input for steering through virtual 3D scenes, we investigate the design of gaze-based steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches that differ in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs were iteratively refined based on two user studies with twelve participants each. In particular, the combination of continuous input and gradient-based velocity selection shows high potential, because it allows the moving speed and direction to change gradually depending on the user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input, for which virtual buttons are toggled via gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible form of discrete input.
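The continuous, gradient-based approach can be illustrated with a minimal sketch: speed grows with the distance of the point-of-regard from the screen centre, and a central dead zone lets the user rest their gaze without moving, which mitigates overshooting. The function name, dead-zone radius, and linear speed gradient below are illustrative assumptions, not the paper's exact model.

```python
import math

def steering_velocity(gaze_x, gaze_y, screen_w, screen_h,
                      dead_zone=0.1, max_speed=2.0):
    """Map a 2D point-of-regard to a 2D steering velocity (vx, vy).

    Gaze inside the central dead zone yields no motion; beyond it,
    speed ramps linearly from 0 at the dead-zone edge to max_speed
    at the screen border (a simple gradient-based velocity mapping).
    """
    # Normalise the gaze point to [-1, 1] with the screen centre at (0, 0).
    nx = (gaze_x / screen_w) * 2.0 - 1.0
    ny = (gaze_y / screen_h) * 2.0 - 1.0
    dist = math.hypot(nx, ny)
    if dist <= dead_zone:
        return (0.0, 0.0)  # resting the gaze centrally stops motion
    # Linear gradient: 0 at the dead-zone edge, max_speed at the border.
    speed = max_speed * min((dist - dead_zone) / (1.0 - dead_zone), 1.0)
    # Steer in the direction of the gaze offset.
    return (speed * nx / dist, speed * ny / dist)
```

Gazing at the screen centre, e.g. `steering_velocity(400, 300, 800, 600)`, returns `(0.0, 0.0)`, while gazing at the right edge returns full speed to the right, so the user modulates velocity simply by how far from the centre they look.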