We propose a new research direction for eye-typing that is potentially much faster: dwell-free eye-typing. Dwell-free eye-typing is in principle possible because we can exploit the high redundancy of natural languages, allowing users to simply look at or near their desired letters without stopping to dwell on each one. As a first step, we created a system that simulated a perfect recognizer for dwell-free eye-typing. We used this system to investigate how fast users can potentially write with a dwell-free eye-typing interface. We found that after 40 minutes of practice, users reached a mean entry rate of 46 words per minute (wpm). This indicates that dwell-free eye-typing may be more than twice as fast as the current state-of-the-art methods for writing by gaze. A human performance model further demonstrates that it is highly unlikely that traditional eye-typing systems will ever surpass our dwell-free eye-typing performance estimate.
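The abstract's core premise, that the redundancy of natural language lets a recognizer disambiguate sloppy, dwell-free letter sequences, can be illustrated with a toy decoder. This is a minimal sketch under simplified assumptions, not the paper's recognizer (which was simulated as perfect): it simply matches the letters a user's gaze passed near against a small vocabulary by edit distance.

```python
# Toy dwell-free decoder: pick the vocabulary word whose spelling is
# closest (in Levenshtein edit distance) to the noisy sequence of
# letters the gaze trace passed near. Illustrative only; a real
# system would also weight words by a language model.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via a rolling dynamic-programming row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,        # deletion
                            curr[j - 1] + 1,    # insertion
                            prev[j - 1] + cost  # substitution/match
                            ))
        prev = curr
    return prev[-1]

def decode(gaze_letters: str, vocabulary: list[str]) -> str:
    """Return the vocabulary word closest to the observed letter trace."""
    return min(vocabulary, key=lambda w: edit_distance(gaze_letters, w))

vocab = ["hello", "help", "world", "word", "typing"]
print(decode("helo", vocab))   # → "hello" (distance 1; ties break by list order)
print(decode("wrold", vocab))  # → "world" (transposed letters cost 2 edits)
```

Because most character sequences are not valid words, even a crude nearest-word match like this recovers the intended word from imprecise input, which is the redundancy the paper proposes to exploit.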