Automated Eye Motion Using Texture Synthesis

  • Authors:
  • Zhigang Deng; J. P. Lewis; Ulrich Neumann

  • Affiliations:
  • University of Southern California (all authors)

  • Venue:
  • IEEE Computer Graphics and Applications
  • Year:
  • 2005

Abstract

Modeling and animating human eyes requires special care because, as the "windows to the soul," the eyes are particularly scrutinized by human observers. Our goal in this article is to simultaneously synthesize realistic eye gaze and blink motion, accounting for any possible correlations between the two. The problem of synthesizing signals that appear similar (but not identical) to a given sample is essentially the same as texture synthesis, but in a one-dimensional (vector-valued) context. We demonstrate that texture synthesis methods can be applied to this animation problem, providing an effective way to capture both perceptible movement and blink statistics, along with any correlations between them. The resulting method is simple to implement yet produces life-like and lively eye motion for applications that require automated movement (for example, game characters) or voiceless eye motion (such as listening avatars).
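
As a concrete illustration of what texture synthesis looks like in a one-dimensional, vector-valued setting, the sketch below extends a short recorded eye-motion signal by nearest-neighbor window matching (in the spirit of Efros-Leung-style synthesis). The `synthesize_eye_motion` function, its window length, and the toy gaze/blink signal are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Minimal sketch of 1D vector-valued texture synthesis for eye motion.
# Assumptions: nearest-neighbor window matching (Efros-Leung style),
# a toy sample signal, and an arbitrary window length of 12 frames.
import numpy as np


def synthesize_eye_motion(sample, out_len, window=12, rng=None):
    """Extend a vector-valued motion sample (frames x channels, e.g. columns
    gaze-x, gaze-y, blink) to `out_len` frames.  At each step the most recent
    `window` frames are matched against every window in the sample, and the
    frame that follows a near-best match is appended."""
    rng = np.random.default_rng() if rng is None else rng
    sample = np.asarray(sample, dtype=float)
    n_frames, _ = sample.shape
    # Seed the output with a randomly chosen window from the sample.
    start = int(rng.integers(0, n_frames - window))
    out = [sample[start + k].copy() for k in range(window)]
    while len(out) < out_len:
        query = np.stack(out[-window:])                    # (window, channels)
        # Sum-of-squares distance from the query to every sample window.
        dists = np.array([np.sum((sample[i:i + window] - query) ** 2)
                          for i in range(n_frames - window)])
        # Choose randomly among near-best matches so the output does not
        # simply loop the sample verbatim.
        near_best = np.flatnonzero(dists <= 1.1 * dists.min() + 1e-9)
        i = int(rng.choice(near_best))
        out.append(sample[i + window].copy())              # copy the next frame
    return np.stack(out[:out_len])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(600)
    # Toy "captured" signal: smooth gaze paths plus sparse blink events.
    gaze_x = np.sin(0.05 * t) + 0.1 * rng.standard_normal(t.size)
    gaze_y = np.cos(0.03 * t) + 0.1 * rng.standard_normal(t.size)
    blink = (rng.random(t.size) < 0.02).astype(float)
    sample = np.column_stack([gaze_x, gaze_y, blink])
    motion = synthesize_eye_motion(sample, out_len=2000, rng=rng)
    print(motion.shape)  # (2000, 3)
```

Because the synthesized frames are copied from windows of the original sample, the output preserves the sample's movement and blink statistics, including any gaze-blink correlations present in the data, while the randomized choice among near-best matches keeps the result from repeating the sample exactly.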