Modeling and animating human eyes requires special care because, as the "windows to the soul", the eyes are closely scrutinized by human observers. Our goal in this article is to simultaneously synthesize realistic eye-gaze and blink motion, accounting for any correlations between the two. The problem of synthesizing signals that appear similar (but not identical) to a given sample is essentially the texture-synthesis problem, restricted to a one-dimensional (vector-valued) setting. We demonstrate that texture-synthesis methods can be applied to this animation problem, providing an effective means of capturing both the perceptible movement and blink statistics of the sample, and any correlations between them. The resulting method is simple to implement yet produces lifelike and lively eye motion for applications that require automated movement (for example, game characters) or non-speech eye motions (such as listening avatars).
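To make the idea concrete, here is a minimal sketch of non-parametric 1-D texture synthesis (in the spirit of Efros–Leung sampling) applied to a multichannel motion signal. This is an illustrative assumption about how such a synthesizer could be structured, not the authors' actual implementation: the sample is an (N, C) array of per-frame values, e.g. two gaze angles plus a binary blink channel, and all channels are matched jointly so that gaze–blink correlations present in the sample carry over to the output.

```python
import numpy as np

def synthesize_motion(sample, length, window=10, k=5, seed=0):
    """Synthesize a new (length, C) signal that is statistically similar
    to `sample` (an (N, C) array) via non-parametric window matching.
    Hypothetical sketch: channels are matched jointly, so correlations
    between them (e.g. gaze vs. blink) are preserved."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    # Seed the output with a window copied verbatim from the sample.
    start = rng.integers(0, n - window)
    out = list(sample[start:start + window])
    while len(out) < length:
        recent = np.asarray(out[-window:])      # context to match against
        starts = np.arange(n - window)          # candidate window positions
        # Distance from the recent context to every sample window.
        dists = np.array([np.linalg.norm(sample[s:s + window] - recent)
                          for s in starts])
        # Sample among the k closest matches to avoid verbatim copying.
        best = starts[np.argsort(dists)[:k]]
        s = rng.choice(best)
        out.append(sample[s + window])          # emit the frame that follows
    return np.asarray(out[:length])
```

Because each emitted frame is copied from the sample, the output reproduces the sample's local statistics while the randomized choice among the k best matches keeps the result similar but not identical, which is the property the abstract describes.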