Learning gaze biases with head motion for head pose-free gaze estimation

  • Authors:
  • Feng Lu; Takahiro Okabe; Yusuke Sugano; Yoichi Sato

  • Affiliations:
  • The University of Tokyo, Japan; Kyushu Institute of Technology, Japan; The University of Tokyo, Japan; The University of Tokyo, Japan

  • Venue:
  • Image and Vision Computing
  • Year:
  • 2014

Abstract

When estimating human gaze directions from captured eye appearances, most existing methods assume a fixed head pose, because head motion changes eye appearance greatly and makes the estimation inaccurate. To handle this difficult problem, we propose in this paper a novel method that performs accurate gaze estimation without restricting the user's head motion. The key idea is to decompose the original free-head-motion problem into subproblems: an initial fixed-head-pose estimation followed by compensations that correct the initial estimation biases. For the initial estimation, automatic image rectification and joint alignment with gaze estimation are introduced. The compensations are then computed by either learning-based regression or geometry-based calculation. The merit of this compensation strategy is that the training requirement for allowing head motion is not significantly increased; only a 5-second video clip needs to be captured. Experimental results show that our method achieves an average accuracy of around 3° using only a single camera.
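The decomposition strategy described above, an initial fixed-head-pose gaze estimate followed by a learned correction of the head-motion-induced bias, can be illustrated with a minimal sketch. The code below is an illustrative assumption, not the authors' implementation: it stands in a toy linear appearance-to-gaze mapping for the initial estimator and a closed-form ridge regression from head pose to gaze bias for the learning-based compensation, with synthetic data playing the role of the short calibration clip.

```python
# Minimal sketch, under assumed (not the paper's) models and data:
# (1) estimate gaze with a fixed-head-pose mapping, then
# (2) learn a regression from head pose to the resulting gaze bias
#     and subtract the predicted bias at test time.
import numpy as np

def initial_gaze_estimate(eye_feature, W_fixed):
    """Fixed-head-pose gaze estimate (toy linear appearance-based mapping)."""
    return W_fixed @ eye_feature  # (yaw, pitch) in degrees

def fit_bias_compensation(head_poses, gaze_errors, reg=1e-3):
    """Ridge regression from head pose (yaw, pitch, roll) to gaze bias,
    trained from a short calibration sequence with known fixation targets."""
    X = np.hstack([head_poses, np.ones((len(head_poses), 1))])  # add intercept
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ gaze_errors)                 # (4, 2) weights

def compensate(gaze_init, head_pose, W_comp):
    """Subtract the predicted head-motion-induced bias from the initial estimate."""
    x = np.append(head_pose, 1.0)
    return gaze_init - x @ W_comp

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W_fixed = rng.normal(size=(2, 6))             # toy appearance-to-gaze mapping
    head_poses = rng.uniform(-15, 15, (50, 3))    # calibration head poses (deg)
    true_bias = head_poses @ np.array([[0.4, 0.0], [0.0, 0.3], [0.1, 0.1]])
    gaze_errors = true_bias + rng.normal(0, 0.2, (50, 2))
    W_comp = fit_bias_compensation(head_poses, gaze_errors)

    eye, pose = rng.normal(size=6), np.array([10.0, -5.0, 2.0])
    g0 = initial_gaze_estimate(eye, W_fixed)
    print("initial:", g0, "compensated:", compensate(g0, pose, W_comp))
```

In this sketch the regression is deliberately lightweight (four weights per gaze angle), reflecting the abstract's point that allowing head motion should not greatly increase the training requirement beyond a brief calibration recording.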