What you look at is what you get: eye movement-based interaction techniques
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The “prince” technique: Fitts' law and selection using area cursors
CHI '95 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
An evaluation of an eye tracker as a device for computer input
CHI '87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface
Manual and gaze input cascaded (MAGIC) pointing
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Intelligent gaze-added interfaces
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Button selection for general GUIs using eye and hand together
AVI '00 Proceedings of the working conference on Advanced visual interfaces
Effective eye-gaze input into Windows
ETRA '00 Proceedings of the 2000 symposium on Eye tracking research & applications
Hand eye coordination patterns in target selection
ETRA '00 Proceedings of the 2000 symposium on Eye tracking research & applications
Application of Fitts' law to eye gaze interaction
CHI '00 Extended Abstracts on Human Factors in Computing Systems
Zooming interfaces!: enhancing the performance of eye controlled pointing devices
Proceedings of the fifth international ACM conference on Assistive technologies
Eye gaze interaction with expanding targets
CHI '04 Extended Abstracts on Human Factors in Computing Systems
EyeWindows: evaluation of eye-controlled zooming windows for focus selection
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Efficient eye pointing with a fisheye lens
GI '05 Proceedings of Graphics Interface 2005
EyePoint: practical pointing and selection using gaze and keyboard
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Dynamics of tilt-based browsing on mobile devices
CHI '07 Extended Abstracts on Human Factors in Computing Systems
Noise tolerant selection by gaze-controlled pan and zoom in 3D
Proceedings of the 2008 symposium on Eye tracking research & applications
Improving eye cursor's stability for eye pointing tasks
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Evaluation of gaze-added target selection methods suitable for general GUIs
International Journal of Computer Applications in Technology
The inspection of very large images by eye-gaze control
AVI '08 Proceedings of the working conference on Advanced visual interfaces
The MAGIC Touch: Combining MAGIC-Pointing with a Touch-Sensitive Mouse
INTERACT '09 Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction: Part II
Eye and pointer coordination in search and selection tasks
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Small-target selection with gaze alone
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
3D user interface combining gaze and hand gestures for large-scale display
CHI '10 Extended Abstracts on Human Factors in Computing Systems
Gaze-based interaction with public displays using off-the-shelf components
Proceedings of the 12th ACM international conference adjunct papers on Ubiquitous computing - Adjunct
Designing gaze-supported multimodal interactions for the exploration of large image collections
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Comparison of gaze-to-objects mapping algorithms
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Designing gaze-based user interfaces for steering in virtual environments
Proceedings of the Symposium on Eye Tracking Research and Applications
Investigating gaze-supported multimodal pan and zoom
Proceedings of the Symposium on Eye Tracking Research and Applications
Gaze interaction in the post-WIMP world
CHI '12 Extended Abstracts on Human Factors in Computing Systems
Gaze tracking and non-touch gesture based interaction method for mobile 3D virtual spaces
Proceedings of the 24th Australian Computer-Human Interaction Conference
Can we beat the mouse with MAGIC?
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Gaze interaction in the Post-WIMP world
CHI '13 Extended Abstracts on Human Factors in Computing Systems
Pursuit calibration: making gaze calibration less tedious and more flexible
Proceedings of the 26th annual ACM symposium on User interface software and technology
Cross-device eye-based interaction
Proceedings of the adjunct publication of the 26th annual ACM symposium on User interface software and technology
Using eye movements to recognize activities on cartographic maps
Proceedings of the 21st ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems
While eye tracking has high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To address this, we propose gaze-supported interaction as a more natural and effective approach that combines a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle "gaze suggests, touch confirms", they include an enhanced gaze-directed cursor, local zoom lenses, and more elaborate techniques that use touch for manual fine positioning of the cursor. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular, the techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated high overall performance and usability.
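The MAGIC tab idea described above — gaze suggests a neighborhood of candidate targets, repeated touch input cycles through them, and a final touch confirms — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the class and function names are hypothetical, and targets are assumed to be 2D points with a fixed gaze-proximity radius.

```python
import math

def targets_near_gaze(gaze, targets, radius):
    """Return the targets within `radius` of the gaze point, nearest first."""
    def dist(t):
        return math.hypot(t[0] - gaze[0], t[1] - gaze[1])
    return sorted((t for t in targets if dist(t) <= radius), key=dist)

class MagicTabSelector:
    """Cycle through close-to-gaze candidates; a confirming touch selects
    whichever candidate is current (hypothetical sketch of MAGIC tab)."""

    def __init__(self, gaze, targets, radius=100.0):
        # Gaze "suggests": pre-filter and rank candidates by gaze distance.
        self.candidates = targets_near_gaze(gaze, targets, radius)
        self.index = 0  # nearest candidate is highlighted first

    def tab(self):
        # Each tab gesture advances to the next candidate, wrapping around.
        if self.candidates:
            self.index = (self.index + 1) % len(self.candidates)
        return self.current()

    def current(self):
        return self.candidates[self.index] if self.candidates else None
```

Ranking candidates by distance means the most likely target is already highlighted before any touch occurs, so in the common case a single confirming touch suffices and cycling is only needed to correct gaze-estimation error.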