While eye tracking is becoming increasingly relevant as a promising input channel, applications using gaze control in a natural way remain rather limited. Although several researchers have noted the particularly high potential of gaze-based interaction for pointing tasks, gaze-only approaches are often investigated, and their time-consuming dwell-time activations limit this potential. To overcome this, we present a gaze-supported fisheye lens combined with (1) a keyboard and (2) a tilt-sensitive mobile multi-touch device. Following a user-centered design approach, we elicited how users would employ these input combinations. Based on the feedback received, we designed a prototype system for interacting with a remote display using gaze and a touch-and-tilt device. This eliminates gaze dwell-time activations and the well-known Midas Touch problem (unintentionally issuing an action via gaze). A formative user study of our prototype provided further insights into how users experienced the elaborated gaze-supported interaction techniques.
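The core idea lends itself to a short illustration: gaze continuously positions a fisheye lens, while an explicit touch on the handheld device confirms the selection, so gaze alone never triggers an action. The Python sketch below is a minimal, hypothetical rendering of this division of labor under assumed names (GazeSample, FisheyeLens, on_gaze, on_touch, pick_target); it is not the authors' implementation.

```python
# A minimal sketch, not the authors' implementation: every name below
# (GazeSample, FisheyeLens, pick_target, ...) is hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    x: float
    y: float

@dataclass
class FisheyeLens:
    x: float = 0.0
    y: float = 0.0
    magnification: float = 3.0   # how strongly targets under the lens are enlarged
    smoothing: float = 0.2       # low-pass factor to damp gaze jitter

    def follow(self, gaze: GazeSample) -> None:
        # Smooth the noisy gaze signal so the lens does not jump around.
        self.x += self.smoothing * (gaze.x - self.x)
        self.y += self.smoothing * (gaze.y - self.y)

def on_gaze(lens: FisheyeLens, sample: GazeSample) -> None:
    # Gaze only steers the lens; it never issues an action by itself,
    # which avoids both dwell-time delays and the Midas Touch problem.
    lens.follow(sample)

def on_touch(lens: FisheyeLens, targets: list) -> Optional[dict]:
    # Selection is confirmed by an explicit touch on the handheld device,
    # so no dwell time is required.
    return pick_target(lens.x, lens.y, targets)

def pick_target(x: float, y: float, targets: list) -> Optional[dict]:
    # Pick the target closest to the lens center (squared distance suffices).
    return min(targets,
               key=lambda t: (t["x"] - x) ** 2 + (t["y"] - y) ** 2,
               default=None)
```

The design choice worth noting is the strict separation of roles: the gaze channel is read-only with respect to actions, so unintended activations cannot occur, while the touch channel carries all commitment.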