An evaluation of an eye tracker as a device for computer input
CHI '87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface
eyeLook: using attention to facilitate mobile media consumption
Proceedings of the 18th annual ACM symposium on User interface software and technology
Eye-gaze interaction for mobile phones
Mobility '07 Proceedings of the 4th international conference on mobile technology, applications, and systems and the 1st international symposium on Computer human interaction in mobile technology
In the Eye of the Beholder: A Survey of Models for Eyes and Gaze
IEEE Transactions on Pattern Analysis and Machine Intelligence
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
MobiGaze: development of a gaze interface for handheld mobile devices
CHI '10 Extended Abstracts on Human Factors in Computing Systems
EyePhone: activating mobile phones with your eyes
Proceedings of the second ACM SIGCOMM workshop on Networking, systems, and applications on mobile handhelds
Evaluation of a remote webcam-based eye tracker
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
This paper investigates whether it is feasible to interact with the small screen of a smartphone using eye movements only. Two of the most common gaze-based selection strategies, dwell-time selection and gaze gestures, are compared in a target selection experiment. Finger strokes and accelerometer-based interaction, i.e., tilting, are also considered. In an experiment with 11 subjects, we found gaze interaction to perform worse than touch interaction, but with an error rate and completion time comparable to accelerometer (i.e., tilt) interaction. Gaze gestures were faster and had a lower error rate than dwell-time selection, especially for small targets, suggesting that gestures may be the best option for hands-free gaze control of smartphones.
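The dwell-time strategy compared in the abstract can be illustrated with a minimal sketch: a target is selected only when consecutive gaze samples stay inside its bounds for an uninterrupted span of at least a dwell threshold. All names, shapes, and the 500 ms threshold below are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Target:
    # Axis-aligned on-screen target; units are arbitrary screen pixels.
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def dwell_select(samples, target, dwell_ms=500):
    """samples: list of (timestamp_ms, x, y) gaze points in time order.
    Returns True once the gaze stays on `target` for >= dwell_ms without
    leaving it; any sample outside the target resets the dwell timer."""
    start = None
    for t, gx, gy in samples:
        if target.contains(gx, gy):
            if start is None:
                start = t  # gaze just entered the target
            if t - start >= dwell_ms:
                return True
        else:
            start = None  # gaze left the target: reset the dwell timer
    return False
```

The reset on exit is what makes small targets slow and error-prone with dwell selection: eye-tracker jitter near a small target's edge keeps restarting the timer, which is consistent with the abstract's finding that gaze gestures handled small targets better.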