User-calibration-free remote gaze estimation system

  • Authors:
  • Dmitri Model; Moshe Eizenman

  • Affiliations:
  • University of Toronto; University of Toronto

  • Venue:
  • Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
  • Year:
  • 2010


Abstract

Gaze estimation systems use calibration procedures that require active subject participation to estimate the point-of-gaze accurately. In these procedures, subjects are required to fixate on a specific point or points in space at specific time instances. This paper describes a gaze estimation system that does not use calibration procedures that require active user participation. The system estimates the optical axes of both eyes using images from a stereo pair of video cameras without a personal calibration procedure. To estimate the point-of-gaze, which lies along the visual axis, the angles between the optical and visual axes are estimated by a novel automatic procedure that minimizes the distance between the intersections of the visual axes of the left and right eyes with the surface of a display while subjects look naturally at the display (e.g., watching a video clip). Experiments with four subjects demonstrate that the RMS error of this point-of-gaze estimation system is 1.3°.
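The core idea of the automatic procedure can be illustrated numerically: since both visual axes should intersect the display at (nearly) the same point-of-gaze, candidate optical-to-visual offset angles can be scored by how far apart the two eyes' screen intersections land, and the offsets chosen to minimize that distance. The sketch below is not the authors' implementation; it assumes a simplified toy geometry (screen at z = 0, eye positions in mm, a single symmetric nasal offset angle per eye, horizontal only, and a plain grid search in place of the paper's optimizer).

```python
import numpy as np

rng = np.random.default_rng(0)

ALPHA_TRUE = np.deg2rad(5.0)              # assumed true optical-to-visual offset
c_left  = np.array([-30.0, 0.0, 600.0])   # toy eye centers, 600 mm from screen
c_right = np.array([ 30.0, 0.0, 600.0])

def rot_y(d, a):
    """Rotate direction vectors about the y axis by angle a (radians)."""
    ca, sa = np.cos(a), np.sin(a)
    x, y, z = d[..., 0], d[..., 1], d[..., 2]
    return np.stack([ca * x + sa * z, y, -sa * x + ca * z], axis=-1)

def screen_hit(c, d):
    """Intersect rays (origin c, directions d) with the screen plane z = 0."""
    t = -c[2] / d[..., 2]
    return c + t[..., None] * d

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Simulate natural viewing: random fixation points on the display surface.
gaze = np.stack([rng.uniform(-200, 200, 50),
                 rng.uniform(-150, 150, 50),
                 np.zeros(50)], axis=-1)

vis_l = unit(gaze - c_left)
vis_r = unit(gaze - c_right)
# The stereo cameras can only observe the optical axes; in this toy model
# the visual axis is the optical axis rotated nasally by +/- alpha.
opt_l = rot_y(vis_l, -ALPHA_TRUE)
opt_r = rot_y(vis_r, ALPHA_TRUE)

def mean_gap(alpha):
    """Mean L/R screen-intersection distance after correcting by alpha."""
    p_l = screen_hit(c_left,  rot_y(opt_l, alpha))
    p_r = screen_hit(c_right, rot_y(opt_r, -alpha))
    return np.mean(np.linalg.norm(p_l - p_r, axis=-1))

# 1-D grid search over candidate offsets, a stand-in for a real optimizer.
alphas = np.deg2rad(np.linspace(0.0, 10.0, 1001))
best = alphas[np.argmin([mean_gap(a) for a in alphas])]
print(f"recovered offset: {np.rad2deg(best):.2f} deg")
```

In this noise-free toy setup the grid search recovers the assumed 5° offset, because at the true angle the two corrected visual axes intersect the screen at exactly the same point and the gap drops to zero; with real image data the minimum is nonzero and the estimate is what bounds the system's residual error.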