Implicit Calibration of a Remote Gaze Tracker

  • Authors:
  • Xavier L. C. Brolly; Jeffrey B. Mulligan

  • Affiliations:
  • NASA Ames Research Center; NASA Ames Research Center

  • Venue:
  • CVPRW '04 Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 8
  • Year:
  • 2004


Abstract

We describe a system designed to monitor the gaze of a user working naturally at a computer workstation. The system consists of three cameras situated between the keyboard and the monitor. Free head movements are allowed within a three-dimensional volume approximately 40 centimeters in diameter. Two fixed, wide-field "face" cameras equipped with active-illumination systems enable rapid localization of the subject's pupils. A third steerable "eye" camera has a relatively narrow field of view, and acquires the images of the eyes which are used for gaze estimation. Unlike previous approaches, which construct an explicit three-dimensional representation of the subject's head and eye, we derive mappings for steering control and gaze estimation using a procedure we call implicit calibration. Implicit calibration is performed by collecting a "training set" of parameters and associated measurements, and solving for a set of coefficients relating the measurements back to the parameters of interest. Preliminary data on three subjects indicate a median gaze estimation error of approximately 0.8 degrees.