Many tasks require attention switching: for example, searching for information on one sheet of paper and then entering it on another. With paper, people use fingers or objects as placeholders; these simple aids simplify and speed up switching attention between displays. With large or multiple visual displays, many tasks place both areas of attention on the screen, where a finger is not a suitable placeholder. One way users deal with this is to highlight their current focus with the mouse, but this too has limitations, particularly in environments with no pointing device. Our approach utilizes the user's gaze position to provide a visual placeholder: the last area where the user fixated on the screen before moving their attention away is highlighted. We call this visual reminder a Gazemark. Gazemarks ease orientation and the resumption of an interrupted task when the user returns to the display. In this paper we report on a study that investigated the effectiveness of Gazemarks; in particular, we show how they can ease attention switching. Our results show faster completion times for a resumed simple visual search task when using this technique. The paper analyzes relevant parameters for implementing Gazemarks and discusses further application areas for this approach.
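The core mechanism described above — remember the last on-screen fixation when the user's attention leaves the display, then highlight it when attention returns — can be sketched as a small state holder. This is a minimal illustration only; the class and method names are hypothetical and do not come from the original Gazemarks implementation, which would additionally handle fixation detection, rendering, and fade-out timing.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Fixation:
    """A detected gaze fixation in screen coordinates."""
    x: float
    y: float
    timestamp: float


class GazemarkTracker:
    """Keeps the last on-screen fixation as a visual placeholder (Gazemark)."""

    def __init__(self) -> None:
        self._last_fixation: Optional[Fixation] = None
        self._gazemark: Optional[Fixation] = None

    def on_fixation(self, fixation: Fixation) -> None:
        # While the user attends to this display, track the latest fixation.
        self._last_fixation = fixation

    def on_attention_left(self) -> None:
        # When gaze leaves the display, freeze the last fixation as the Gazemark.
        self._gazemark = self._last_fixation

    def on_attention_returned(self) -> Optional[Fixation]:
        # On return, hand the Gazemark to the UI layer for highlighting,
        # then clear it so it is shown only once per resumption.
        mark, self._gazemark = self._gazemark, None
        return mark
```

A caller would feed fixations from an eye tracker into `on_fixation`, signal display changes via `on_attention_left`, and, on `on_attention_returned`, draw a highlight at the returned coordinates until the user re-fixates.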