We investigate how to seamlessly bridge the gap between users and distant displays for basic interaction tasks, such as object selection and manipulation. For this, we take advantage of fast and implicit, yet imprecise, gaze- and head-directed input in combination with ubiquitous smartphones for additional manual touch control. We have carefully developed two novel and consistent sets of gaze-supported interaction techniques based on touch-enhanced gaze pointers and local magnification lenses. These conflict-free sets allow users to fluently select and position distant targets. Both sets were evaluated in a user study with 16 participants. Overall, after some training, users were fastest with a touch-enhanced gaze pointer for selecting and positioning an object. While the positive user feedback for both sets suggests that our proposed gaze- and head-directed interaction techniques support convenient and fluent selection and manipulation of distant targets, further improvements are necessary for more precise cursor control.
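To make the touch-enhanced gaze pointer idea concrete, here is a minimal Python sketch of the general cascade the abstract describes: smoothed gaze drives a coarse cursor, relative drags on a smartphone touchscreen refine it, and a gaze jump (saccade) to a new target discards the stale correction. This is an illustration under assumptions, not the authors' implementation; the class name GazeTouchPointer and all parameters (smoothing factor, touch gain, the 100 px saccade threshold) are hypothetical.

```python
import math


class GazeTouchPointer:
    """Illustrative sketch (not the paper's code): coarse gaze position
    plus a fine manual correction from touchscreen drags."""

    def __init__(self, smoothing=0.8, touch_gain=0.3, saccade_px=100.0):
        self.smoothing = smoothing    # exponential smoothing for noisy gaze samples (assumed)
        self.touch_gain = touch_gain  # scales touchscreen drags to on-screen pixels (assumed)
        self.saccade_px = saccade_px  # jump size treated as a saccade to a new target (assumed)
        self.gaze = None              # smoothed gaze point in screen pixels
        self.offset = (0.0, 0.0)      # manual correction accumulated from touch drags

    def on_gaze_sample(self, x, y):
        """Fold a raw gaze sample into the smoothed estimate."""
        if self.gaze is None:
            self.gaze = (float(x), float(y))
            return
        gx, gy = self.gaze
        if math.hypot(x - gx, y - gy) > self.saccade_px:
            # The user looked at a new target: re-centre the cursor there
            # and drop the stale manual correction.
            self.gaze = (float(x), float(y))
            self.offset = (0.0, 0.0)
            return
        a = self.smoothing
        self.gaze = (a * gx + (1 - a) * x, a * gy + (1 - a) * y)

    def on_touch_drag(self, dx, dy):
        """Accumulate a relative touch movement as a fine correction."""
        ox, oy = self.offset
        self.offset = (ox + self.touch_gain * dx, oy + self.touch_gain * dy)

    def cursor(self):
        """Current cursor = smoothed gaze + manual touch correction."""
        gx, gy = self.gaze if self.gaze is not None else (0.0, 0.0)
        return (gx + self.offset[0], gy + self.offset[1])


# Usage: gaze lands near a distant target; a short drag on the phone
# nudges the cursor precisely onto it before confirming the selection.
p = GazeTouchPointer()
for sample in [(800, 450), (805, 448), (798, 452)]:
    p.on_gaze_sample(*sample)
p.on_touch_drag(20, -10)
print(p.cursor())
```

The saccade reset mirrors the rationale behind MAGIC-style pointing: gaze is fast but imprecise, so it is used only to warp the cursor near the target, while the slower but precise manual channel handles the final approach.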