What you look at is what you get: eye movement-based interaction techniques
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS) - Special issue on computer-human interaction
An evaluation of an eye tracker as a device for computer input
CHI '87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface
Manual and gaze input cascaded (MAGIC) pointing
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Interacting with eye movements in virtual environments
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Effective eye-gaze input into Windows
ETRA '00 Proceedings of the 2000 symposium on Eye tracking research & applications
Twenty years of eye typing: systems and design issues
ETRA '02 Proceedings of the 2002 symposium on Eye tracking research & applications
Eye and gaze tracking for interactive graphic display
Machine Vision and Applications
What you don't look at is what you get: anti-saccades can reduce the Midas touch problem
APGV '05 Proceedings of the 2nd symposium on Applied perception in graphics and visualization
A Fitts Law comparison of eye tracking and manual input in the selection of visual targets
ICMI '08 Proceedings of the 10th international conference on Multimodal interfaces
Wearable augmented reality system using gaze interaction
ISMAR '08 Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality
Eye gaze tracking techniques for interactive applications
Computer Vision and Image Understanding - Special issue on eye detection and tracking
Inferring object relevance from gaze in dynamic scenes
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Small-target selection with gaze alone
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Only one Fitts' law formula please!
CHI '10 Extended Abstracts on Human Factors in Computing Systems
Gaze gestures or dwell-based interaction?
Proceedings of the Symposium on Eye Tracking Research and Applications
Eye gaze tracking provides a natural and fast method of interacting with computers. Many click alternatives have been proposed, each with its own merits and drawbacks. We focus on the most natural selection method, dwell, in which a user selects an on-screen object simply by gazing at it for a pre-defined dwell time. We examined three design parameters of the dwell click alternative: dwell time, button size, and placement of content. Two experiments with similar user interfaces were designed and conducted with 21 and 15 participants, respectively. Different combinations of dwell times and button sizes were tested for each participant in each experiment. One experiment placed content on the buttons to be gazed at, while the other placed content above the buttons. One important finding is that moving the content outside the clickable areas avoids accidental clicking, i.e. the Midas Touch problem. In such a design, a combination of large buttons and short dwell times is best suited for maximizing accuracy and ease of use, due to a phenomenon identified as the 'gaze-hold' problem.
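The dwell mechanism described in the abstract can be sketched in a few lines: a target fires a "click" once the gaze has rested inside its bounds for the configured dwell time. This is a minimal illustrative sketch, not the authors' implementation; the class name, sample format, and parameters are assumptions.

```python
class DwellSelector:
    """Illustrative sketch of dwell-based selection: an on-screen target
    is 'clicked' once the gaze rests inside it for dwell_time seconds."""

    def __init__(self, dwell_time=0.5):
        self.dwell_time = dwell_time  # seconds of continuous fixation required
        self._target = None           # button currently under the gaze
        self._enter_ts = None         # when the gaze entered that button

    def update(self, x, y, buttons, timestamp):
        """Feed one gaze sample; return the selected button name or None.

        buttons: dict mapping name -> (left, top, width, height).
        """
        hit = None
        for name, (bx, by, bw, bh) in buttons.items():
            if bx <= x < bx + bw and by <= y < by + bh:
                hit = name
                break
        if hit != self._target:  # gaze moved to a different region: restart timer
            self._target, self._enter_ts = hit, timestamp
            return None
        if hit is not None and timestamp - self._enter_ts >= self.dwell_time:
            self._enter_ts = timestamp  # re-arm so the click does not repeat
            return hit
        return None


# Usage: three samples inside one button; the click fires only after
# the dwell time has elapsed.
sel = DwellSelector(dwell_time=0.5)
buttons = {"ok": (0, 0, 100, 100)}  # a single 100x100-pixel button
print(sel.update(10, 10, buttons, 0.0))  # gaze enters the button
print(sel.update(12, 11, buttons, 0.3))  # still dwelling, too early
print(sel.update(15, 12, buttons, 0.6))  # dwell time reached
```

Note how the two parameters studied in the paper appear directly here: `dwell_time` and the button rectangle's size, which together determine how easily a fixation completes a selection.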