The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS), special issue on computer-human interaction
Text input methods for eye trackers using off-screen targets
ETRA '00 Proceedings of the 2000 symposium on Eye tracking research & applications
Effective eye-gaze input into Windows
ETRA '00 Proceedings of the 2000 symposium on Eye tracking research & applications
EyeDraw: enabling children with severe motor impairments to draw with their eyes
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Gaze-enhanced scrolling techniques
CHI '07 Extended Abstracts on Human Factors in Computing Systems
Snap clutch, a moded approach to solving the Midas touch problem
Proceedings of the 2008 symposium on Eye tracking research & applications
Spoken words: activating text-to-speech through eye closure
CHI '08 Extended Abstracts on Human Factors in Computing Systems
Perspective change: a system for switching between on-screen views by closing one eye
AVI '08 Proceedings of the working conference on Advanced visual interfaces
Designing gaze gestures for gaming: an investigation of performance
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Interacting with the computer using gaze gestures
INTERACT'07 Proceedings of the 11th IFIP TC 13 international conference on Human-computer interaction - Volume Part II
Gaze and voice controlled drawing
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Improving accuracy in face tracking user interfaces using consumer devices
Proceedings of the 1st Annual conference on Research in information technology
EyeSketch: a drawing application for gaze control
Proceedings of the 2013 Conference on Eye Tracking South Africa
Text entry by gazing and smiling
Advances in Human-Computer Interaction
We created a set of gaze gestures built from three elements: simple one-segment gestures, off-screen space, and the closure of the eyes. The gestures serve as the moving tool in a drawing application controlled by gaze alone. We tested them with 24 participants and analyzed gesture durations, the accuracy of the stops, and overall gesture performance. The difference in duration between short and long gestures was small enough that there is no need to choose between them. Stops made by closing both eyes were accurate, and this input method worked well for the purpose. With some adjustments and support for personal settings, both gesture performance and the accuracy of the stops could improve further.
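The combination described above can be illustrated with a minimal sketch: a one-segment gesture is recognized by the first off-screen region the gaze enters, and a sustained closure of both eyes is treated as the stop command. All names, thresholds, and the screen resolution here are assumptions for illustration, not details from the paper.

```python
from dataclasses import dataclass

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution
CLOSURE_FRAMES = 5               # assumed frames of closure that count as a stop


@dataclass
class Sample:
    x: float           # gaze x in pixels; may fall outside the screen
    y: float           # gaze y in pixels
    eyes_closed: bool  # True when the tracker reports both eyes closed


def off_screen_direction(s: Sample):
    """Map an off-screen gaze sample to one of four gesture directions."""
    if s.x < 0:
        return "left"
    if s.x > SCREEN_W:
        return "right"
    if s.y < 0:
        return "up"
    if s.y > SCREEN_H:
        return "down"
    return None  # gaze is still on screen


def detect_gesture(samples):
    """Return (direction, stopped) for a one-segment off-screen gesture.

    direction: first off-screen region the gaze entered, or None.
    stopped: True if the stream ends with a sustained both-eyes closure.
    """
    direction = None
    closure_run = 0
    for s in samples:
        if s.eyes_closed:
            closure_run += 1
        else:
            closure_run = 0
            if direction is None:
                direction = off_screen_direction(s)
    return direction, closure_run >= CLOSURE_FRAMES
```

For example, a glance that crosses the right screen edge and then ends with both eyes held shut would yield `("right", True)`, while gaze that stays on screen yields `(None, False)`.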