The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS) - Special issue on computer-human interaction
Designing the user interface (2nd ed.): strategies for effective human-computer interaction
Stretching the rubber sheet: a metaphor for viewing large layouts on small screens
UIST '93 Proceedings of the 6th annual ACM symposium on User interface software and technology
The limits of expert performance using hierarchic marking menus
CHI '93 Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems
Effective eye-gaze input into Windows
ETRA '00 Proceedings of the 2000 symposium on Eye tracking research & applications
Twenty years of eye typing: systems and design issues
ETRA '02 Proceedings of the 2002 symposium on Eye tracking research & applications
A character-level error analysis technique for evaluating text entry methods
Proceedings of the second Nordic conference on Human-computer interaction
Zooming interfaces!: enhancing the performance of eye controlled pointing devices
Proceedings of the fifth international ACM conference on Assistive technologies
Gaze-orchestrated dynamic windows
SIGGRAPH '81 Proceedings of the 8th annual conference on Computer graphics and interactive techniques
A Calibration-Free Gaze Tracking Technique
ICPR '00 Proceedings of the International Conference on Pattern Recognition - Volume 4
A free-head, simple calibration, gaze tracking system that enables gaze-based interaction
Proceedings of the 2004 symposium on Eye tracking research & applications
Gaze typing compared with input by head and hand
Proceedings of the 2004 symposium on Eye tracking research & applications
Eye gaze interaction with expanding targets
CHI '04 Extended Abstracts on Human Factors in Computing Systems
EyeWindows: evaluation of eye-controlled zooming windows for focus selection
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Computer Vision and Image Understanding - Special issue on eye detection and tracking
Efficient eye pointing with a fisheye lens
GI '05 Proceedings of Graphics Interface 2005
openEyes: a low-cost head-mounted eye-tracking solution
Proceedings of the 2006 symposium on Eye tracking research & applications
Effects of feedback and dwell time on eye typing speed and accuracy
Universal Access in the Information Society
Eye Tracking Methodology: Theory and Practice
Low-cost gaze pointing and EMG clicking
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Low-cost gaze interaction: ready to deliver the promises
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Evaluation of a low-cost open-source gaze tracker
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
An open source eye-gaze interface: expanding the adoption of eye-gaze in everyday applications
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Small-target selection with gaze alone
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Context switching for fast key selection in text entry applications
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Alternatives to single character entry and dwell time selection on eye typing
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Exploring camera viewpoint control models for a multi-tasking setting in teleoperation
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Designing gaze-supported multimodal interactions for the exploration of large image collections
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Designing gaze-based user interfaces for steering in virtual environments
Proceedings of the Symposium on Eye Tracking Research and Applications
Typing with eye-gaze and tooth-clicks
Proceedings of the Symposium on Eye Tracking Research and Applications
Dynamic context switching for gaze based interaction
Proceedings of the Symposium on Eye Tracking Research and Applications
Investigating gaze-supported multimodal pan and zoom
Proceedings of the Symposium on Eye Tracking Research and Applications
Gaming with gaze and losing with a smile
Proceedings of the Symposium on Eye Tracking Research and Applications
Look & touch: gaze-supported target acquisition
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Enhanced gaze interaction using simple head gestures
Proceedings of the 2012 ACM Conference on Ubiquitous Computing
Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing
Design and evaluation of 3D selection techniques based on progressive refinement
International Journal of Human-Computer Studies
This paper presents StarGazer, a new 3D interface for gaze-based interaction and target selection using continuous pan and zoom. Through StarGazer we address the issues of interacting with graph-structured data and applications (e.g., gaze typing systems) using low-resolution eye trackers or small displays. We show that it is possible to make robust selections even with a large number of selectable items on the screen and noisy gaze trackers. A test with 48 subjects demonstrated that users who had never tried gaze interaction before could rapidly adapt to the navigation principles of StarGazer. We tested three display sizes (down to PDA-sized displays) and found that large screens are faster to navigate than small displays and that the error rate is higher for the smallest display. Half of the subjects were exposed to severe noise deliberately added to the cursor positions. We found that this had a negative impact on efficiency. However, the users remained in control, and the noise did not seem to affect the error rate. Additionally, three subjects tested the effect of adding a temporal delay to the cursor to simulate latency in the gaze tracker. Even with a significant latency (about 200 ms) the subjects were able to type at acceptable rates. In a second test, seven subjects were allowed to adjust the zooming speed themselves. They achieved typing rates of more than eight words per minute without using language modeling. We conclude that StarGazer is an intuitive 3D interface for gaze navigation, allowing more selectable objects to be displayed on the screen than the accuracy of the gaze tracker would otherwise permit.
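The abstract's experimental manipulations (noisy cursor positions, a ~200 ms latency simulation) and its typing-rate metric can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: it assumes Gaussian cursor noise, a fixed frame-delay model of latency (at a 60 Hz tracker, 12 frames ≈ 200 ms), and the standard five-characters-per-word convention for words per minute; all names are hypothetical.

```python
import random
from collections import deque

def add_cursor_noise(sample, sigma):
    """Perturb an (x, y) gaze sample with Gaussian noise (assumed noise model)."""
    x, y = sample
    return (x + random.gauss(0, sigma), y + random.gauss(0, sigma))

class LatencyBuffer:
    """Delay gaze samples by a fixed number of frames to simulate tracker latency.

    At 60 Hz, delay_frames=12 corresponds to roughly 200 ms, the latency
    figure reported in the abstract.
    """
    def __init__(self, delay_frames):
        self.delay = delay_frames
        self.buf = deque()

    def push(self, sample):
        """Feed in the newest sample; return the sample from delay_frames ago."""
        self.buf.append(sample)
        if len(self.buf) > self.delay:
            return self.buf.popleft()
        return None  # not enough history yet; no delayed sample available

def words_per_minute(chars_typed, seconds):
    """Standard text-entry metric: one 'word' is defined as 5 characters."""
    return (chars_typed / 5) / (seconds / 60)
```

For example, typing 200 characters in 5 minutes yields `words_per_minute(200, 300) == 8.0`, matching the "more than eight words per minute" figure as the threshold reported in the second test.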