In this paper, we discuss the use of eye-gaze tracking technology on mobile phones. In particular, we investigate how gaze interaction can be used to control applications on handheld devices. In contrast to eye-tracking systems for desktop computers, mobile devices pose additional problems, such as strong ambient light in outdoor use and the difficulty of calibration. We therefore compared two approaches to controlling mobile phones with the eyes: standard eye-gaze interaction based on the dwell-time method, and gaze gestures. Gaze gestures are a new concept that we believe has the potential to overcome many of these problems. We conducted a user study to examine whether people can interact with applications using these approaches. The results confirm that eye-gaze interaction on mobile phones is attractive to users and that gaze gestures are a viable alternative to dwell-based eye-gaze interaction.
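To make the dwell-time method concrete, the following is a minimal sketch of dwell-based selection, not the authors' implementation: a stream of timestamped gaze samples is hit-tested against on-screen targets, and a target is selected once the gaze has rested on it for a fixed dwell threshold. The sample format, target representation, and the 0.5 s threshold are illustrative assumptions.

```python
# Minimal dwell-time selection sketch (illustrative, not the paper's code).
# Gaze samples are (timestamp_seconds, x, y) tuples; targets are dicts
# with a name and an axis-aligned bounding box.

DWELL_TIME = 0.5  # assumed dwell threshold in seconds

def hit_test(targets, x, y):
    """Return the target whose bounding box contains (x, y), or None."""
    for t in targets:
        if t["x"] <= x < t["x"] + t["w"] and t["y"] <= y < t["y"] + t["h"]:
            return t
    return None

def dwell_select(samples, targets, dwell_time=DWELL_TIME):
    """Yield a target each time the gaze dwells on it for dwell_time seconds."""
    current, start = None, None
    for ts, x, y in samples:
        target = hit_test(targets, x, y)
        if target is not current:
            # Gaze moved onto a different target (or off all targets):
            # restart the dwell timer.
            current, start = target, ts
        elif current is not None and ts - start >= dwell_time:
            yield current
            # Reset after a selection so the same fixation does not
            # trigger repeatedly.
            current, start = None, None
```

A gaze-gesture recognizer would instead ignore absolute position and match the sequence of coarse gaze-movement directions (e.g. "down, right") against stored stroke patterns, which is why it can tolerate the calibration drift that degrades dwell-based pointing.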