Hand-held portable devices have so far received little attention as a platform in the eye tracking community, mainly because of their -- until recently -- limited sensing capabilities and processing power. In this work-in-progress paper, we present the first prototype eye gesture recognition system for portable devices that requires no additional equipment. The system combines techniques from image processing, computer vision, and pattern recognition to detect eye gestures in video recorded with the built-in front-facing camera. In a five-participant user study, we show that our prototype can recognise four different continuous eye gestures in near real-time, with an average accuracy of 60% on an Android-based smartphone (17.6% false positives) and 67.3% on a laptop (5.9% false positives). This initial result is promising and underlines the potential of eye tracking and eye-based interaction on portable devices.
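The abstract does not detail the recognition pipeline, but the pattern-recognition stage it mentions can be sketched in miniature. The sketch below is a hypothetical illustration, not the authors' implementation: it assumes an earlier computer-vision stage has already extracted per-frame pupil displacements (dx, dy) from the front-facing camera video, and the gesture templates, function names, and threshold are invented for this example.

```python
def quantise(dx, dy, min_move=2.0):
    """Map one pupil displacement to a stroke direction, or None if too small.

    min_move is an assumed noise threshold (pixels); displacements below it
    are treated as fixations and ignored.
    """
    if abs(dx) < min_move and abs(dy) < min_move:
        return None
    if abs(dx) >= abs(dy):
        return "R" if dx > 0 else "L"
    return "D" if dy > 0 else "U"


# Four example continuous eye gestures as stroke templates (assumed; the
# paper's actual gesture set is not given in the abstract).
GESTURES = {
    "RL": "right-left",
    "LR": "left-right",
    "UD": "up-down",
    "DU": "down-up",
}


def recognise(displacements):
    """Classify a sequence of (dx, dy) pupil displacements as a gesture.

    Consecutive frames moving in the same direction are collapsed into a
    single stroke, and the resulting stroke string is matched against the
    gesture templates. Returns the gesture name or None.
    """
    strokes = []
    for dx, dy in displacements:
        d = quantise(dx, dy)
        if d is not None and (not strokes or strokes[-1] != d):
            strokes.append(d)
    return GESTURES.get("".join(strokes))
```

For example, a rightward drift over two frames followed by a leftward drift yields the stroke string `"RL"` and is classified as `"right-left"`; sub-threshold jitter produces no strokes and returns `None`. A real system on a smartphone would additionally need temporal segmentation to separate gestures from natural eye movement, which the false-positive rates in the abstract suggest is the hard part.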