A Calibration-Free Gaze Tracking Technique
ICPR '00 Proceedings of the International Conference on Pattern Recognition - Volume 4
Gaze estimation systems use calibration procedures that require active subject participation to estimate the point-of-gaze accurately: subjects must fixate on a specific point or points in space at specific time instances. This paper describes a gaze estimation system that requires no such active user participation. The system estimates the optical axes of both eyes from images captured by a stereo pair of video cameras, without a personal calibration procedure. Because the point-of-gaze lies along the visual axis rather than the optical axis, the angles between the two axes are estimated by a novel automatic procedure that minimizes the distance between the intersections of the left- and right-eye visual axes with the display surface while subjects look at the display naturally (e.g., while watching a video clip). Experiments with four subjects show that the RMS error of this point-of-gaze estimation system is 1.3°.
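The core idea of the automatic procedure can be sketched as follows: given per-frame optical-axis directions for both eyes, search for the per-eye optical-to-visual-axis deviation angles (horizontal and vertical components of angle kappa) that minimize the distance between the points where the two visual axes pierce the display plane. The sketch below uses synthetic data; the geometry, variable names, and the simple coordinate-descent optimizer are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rot(alpha, beta):
    """Rotation applying a horizontal (about y) then vertical (about x) deviation."""
    ca, sa, cb, sb = np.cos(alpha), np.sin(alpha), np.cos(beta), np.sin(beta)
    Ry = np.array([[ca, 0, sa], [0, 1, 0], [-sa, 0, ca]])
    Rx = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])
    return Rx @ Ry

def screen_hits(c, D):
    """Intersections of rays c + t*D with the display plane z = 0."""
    t = -c[2] / D[:, 2]
    return c[:2] + t[:, None] * D[:, :2]

# Synthetic session: eye (cornea-centre) positions 60 cm from the display,
# natural fixations scattered over a 40 x 30 cm screen (assumed geometry).
c_L, c_R = np.array([-0.03, 0.0, 0.6]), np.array([0.03, 0.0, 0.6])
n = 200
gaze = np.column_stack([rng.uniform(-0.20, 0.20, n),
                        rng.uniform(-0.15, 0.15, n),
                        np.zeros(n)])
# True kappa angles (aL, bL, aR, bR); ~5 deg horizontally is typical.
true_kappa = np.radians([5.0, 1.5, -5.0, 1.5])

def optical_axes(c, g, alpha, beta):
    v = (g - c) / np.linalg.norm(g - c, axis=1, keepdims=True)  # visual axes
    return v @ rot(alpha, beta)            # undo kappa -> "measured" optical axes

o_L = optical_axes(c_L, gaze, true_kappa[0], true_kappa[1])
o_R = optical_axes(c_R, gaze, true_kappa[2], true_kappa[3])

def objective(k):
    """Mean squared distance between left and right visual-axis screen hits."""
    p_L = screen_hits(c_L, o_L @ rot(k[0], k[1]).T)
    p_R = screen_hits(c_R, o_R @ rot(k[2], k[3]).T)
    return np.mean(np.sum((p_L - p_R) ** 2, axis=1))

# Derivative-free coordinate descent (a stand-in for the paper's optimizer).
k = np.zeros(4)
step = np.radians(2.0)
while step > np.radians(0.01):
    improved = False
    for i in range(4):
        for s in (step, -step):
            trial = k.copy()
            trial[i] += s
            if objective(trial) < objective(k):
                k, improved = trial, True
    if not improved:
        step /= 2
```

Because a rotation applied equally to both eyes shifts both screen intersections by nearly the same amount, the objective constrains the left-right *difference* of the deviation angles much more strongly than their common component; varied gaze directions across the display are what make the minimum unique in practice.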