Eye-typing performance results are reported from controlled studies comparing an on-screen keyboard with EyeWrite, a new on-screen gestural input alternative. Results from the first, pilot study suggest a learning curve that novice users must overcome to gain proficiency with EyeWrite's letter-like gestural alphabet. Results from the second, longitudinal study indicate that EyeWrite's inherent multi-saccade handicap (4.52 saccades per character, frequency-weighted average) is sufficient for the on-screen keyboard to edge out EyeWrite in speed. Eye-typing speeds with EyeWrite approach 5 wpm on average (8 wpm attainable by proficient users), whereas keyboard users achieve about 7 wpm on average (in line with previous results). However, EyeWrite users leave significantly fewer uncorrected errors in the final text, with no significant difference in the number of errors corrected during entry, indicating a speed-accuracy trade-off. Subjective results indicate that participants consider EyeWrite significantly faster, easier to use, and less likely to cause ocular fatigue than the on-screen keyboard. In addition, EyeWrite consumes much less screen real estate than an on-screen keyboard, giving it practical advantages for eye-based text entry.
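The two metrics quoted above are standard in text-entry research: a frequency-weighted average of saccades (or strokes) per character, and entry rate in words per minute, where one "word" is conventionally five characters. A minimal sketch of both computations follows; the letter frequencies and per-character saccade counts below are illustrative placeholders, not EyeWrite's measured gesture data.

```python
# Sketch of the two text-entry metrics referenced in the abstract.
# All numeric inputs below are hypothetical examples.

def weighted_saccades_per_char(saccades_per_char, letter_freqs):
    """Frequency-weighted average saccades per character:
    sum over letters of (saccades * relative frequency),
    normalized by the total frequency mass covered."""
    total_freq = sum(letter_freqs.values())
    weighted = sum(saccades_per_char[c] * letter_freqs[c] for c in letter_freqs)
    return weighted / total_freq

def words_per_minute(chars_entered, elapsed_seconds):
    """Standard entry rate: one 'word' = 5 characters."""
    return (chars_entered / 5) / (elapsed_seconds / 60)

# Illustrative values only (not from the EyeWrite study):
saccades = {"e": 4, "t": 5, "a": 4}          # saccades needed per letter
freqs = {"e": 0.127, "t": 0.091, "a": 0.082}  # English letter frequencies
avg_saccades = weighted_saccades_per_char(saccades, freqs)

# 350 characters in 10 minutes -> 7 wpm, matching the keyboard average above.
rate = words_per_minute(350, 600)
```

With this convention, a participant entering 350 characters in 10 minutes scores exactly 7 wpm, the keyboard average the abstract reports.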