We propose a real-time gaze estimation method based on facial-feature tracking using a single video camera that does not require any special user action for calibration. Many gaze estimation methods have already been proposed; however, most conventional gaze tracking algorithms can only be applied in experimental environments due to their complex calibration procedures and lack of usability. In this paper, we propose a gaze estimation method that can be applied in daily-life situations. Gaze directions are determined as 3D vectors connecting the eyeball centers and the iris centers. Since the eyeball center and radius cannot be directly observed from images, the geometrical relationship between the eyeball centers and the facial features, together with the eyeball radius (the face/eye model), is calculated in advance. The 2D positions of the eyeball centers can then be determined by tracking the facial features. While conventional methods require instructing users to perform special actions, such as looking at several reference points during calibration, the proposed method requires no such actions and is realized by combining 3D eye-model-based gaze estimation with circle-based algorithms for eye-model calibration. Experimental results show that the gaze estimation accuracy of the proposed method is 5° horizontally and 7° vertically. With our proposed method, various applications that require gaze information in daily-life situations, such as gaze-communication robots and gaze-based interactive signboards, become possible.
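The core geometric step described above, computing the gaze direction as the 3D vector from the eyeball center to the iris center and expressing it as horizontal/vertical angles, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the coordinate values, the camera-frame convention (camera looking along −z toward the face), and the helper name `gaze_direction` are assumptions for the example.

```python
import numpy as np

def gaze_direction(eyeball_center, iris_center):
    """Unit 3D gaze vector from the eyeball center to the iris center.

    Both points are assumed to be in the same (e.g. camera) coordinate
    frame; the eyeball center itself would come from tracked facial
    features plus a precomputed face/eye model, as in the paper.
    """
    v = np.asarray(iris_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return v / np.linalg.norm(v)

def gaze_angles_deg(direction):
    """Horizontal (yaw) and vertical (pitch) gaze angles in degrees,
    assuming the camera views the face along the -z axis."""
    x, y, z = direction
    yaw = np.degrees(np.arctan2(x, -z))
    pitch = np.degrees(np.arctan2(y, -z))
    return yaw, pitch

# Example with assumed coordinates (mm, camera frame): the iris center
# lies in front of the eyeball center, so the subject looks at the camera.
d = gaze_direction([0.0, 0.0, 60.0], [0.0, 0.0, 47.0])
yaw, pitch = gaze_angles_deg(d)   # both ~0° for a straight-ahead gaze
```

The reported 5° horizontal / 7° vertical accuracy would be measured on exactly such yaw/pitch angles against ground-truth gaze targets.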