A gaze input interface offers hands-free operation by using the gaze position as the cursor coordinates on the display. However, the act of selecting a button is indistinguishable from merely looking at it; this is known as the Midas touch problem. We propose a new input method that measures divergence eye movement, enabling users to "press" a button by shifting their viewpoint forward in depth. A comparison of our method with the conventional blink-based input method confirms that input speed and accuracy are comparable.
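The idea of distinguishing a deliberate "press" from ordinary viewing via divergence can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a binocular tracker that reports each eye's horizontal gaze point on the screen plane, and the sample format, threshold, and dwell length are hypothetical.

```python
# Hypothetical sketch of divergence-based selection. Assumes a binocular
# eye tracker supplying (left_x, right_x) per frame; all names and
# thresholds below are illustrative, not taken from the paper.

def disparity(left_x: float, right_x: float) -> float:
    """Horizontal disparity between the two eyes' gaze points (pixels).

    Near zero when fixating on the screen plane; it goes negative as the
    eyes diverge toward a point behind the screen (the left eye's gaze
    point drifts left of the right eye's).
    """
    return left_x - right_x

def detect_press(samples, threshold=-30.0, hold_frames=10):
    """Report a 'press' when disparity stays below `threshold` for
    `hold_frames` consecutive samples, so a brief glance past the
    screen does not trigger a selection.

    Returns the index of the sample that completed the press, or None.
    """
    run = 0
    for i, (left_x, right_x) in enumerate(samples):
        if disparity(left_x, right_x) < threshold:
            run += 1
            if run >= hold_frames:
                return i
        else:
            run = 0  # divergence interrupted; restart the dwell count
    return None

# Usage: five frames of on-screen fixation, then a sustained divergence.
samples = [(400.0, 402.0)] * 5 + [(380.0, 420.0)] * 10
print(detect_press(samples))
```

Requiring the divergence to be held for several frames plays the same role as a dwell time in conventional gaze interfaces: it filters out transient vergence changes that would otherwise reintroduce the Midas touch problem.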