Eye and gaze tracking for interactive graphic display

  • Authors:
  • Zhiwei Zhu; Qiang Ji

  • Affiliations:
  • Department of Electrical, Computer, and Systems Engineering, Rensselaer Polytechnic Institute, JEC 6219, Troy, NY (both authors)

  • Venue:
  • Machine Vision and Applications
  • Year:
  • 2004

Abstract

This paper describes a computer vision system based on active IR illumination for real-time gaze tracking in interactive graphic displays. Unlike most existing gaze tracking techniques, which often assume a static head and require a cumbersome per-user calibration process, our gaze tracker performs robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping need not be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function generalizes to individuals not used in training. To further improve gaze estimation accuracy, we employ a hierarchical classification scheme that handles the classes most prone to misclassification, reducing classification error by 10%. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments involving gaze-contingent interactive graphic display.
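A GRNN of the kind the abstract describes is, at its core, Gaussian kernel (Nadaraya–Watson) regression over stored calibration samples: a query's output is a distance-weighted average of the training targets. The sketch below illustrates that idea for a pupil-parameters-to-screen-coordinates mapping; the feature layout, the `grnn_predict` helper, and the smoothing width `sigma` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.5):
    """GRNN prediction: Gaussian-weighted average of training targets.

    X_train: (n, d) calibration inputs (e.g. pupil parameters)
    Y_train: (n, k) calibration targets (e.g. screen coordinates)
    x:       (d,)   query input
    """
    d2 = np.sum((np.asarray(X_train) - np.asarray(x)) ** 2, axis=1)  # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))                             # RBF pattern-layer weights
    # Summation layer: weighted sum of targets, normalized by total weight.
    return (w[:, None] * np.asarray(Y_train)).sum(axis=0) / w.sum()

# Hypothetical calibration set: 6 pupil parameters -> 2 screen coordinates.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 6))
Y = X[:, :2] * 400 + 400          # toy mapping to pixel coordinates, for illustration only
gaze = grnn_predict(X, Y, X[3], sigma=0.3)
```

Because the network simply memorizes the calibration samples, there is no iterative training; the single hyperparameter `sigma` trades smoothness against fidelity to individual samples.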