Eyes play an important role in communication among people. Eye motion expresses emotion and regulates the flow of conversation. We therefore consider it fundamental that virtual humans and other characters exhibit convincing, expressive gaze in applications such as Embodied Conversational Agents (ECAs), games, and films. However, in many applications that require the automatic generation of facial movements, such as ECAs, the character's eye motion carries no expressive meaning. This work proposes a model for the automatic generation of expressive gaze based on an examination of eye behavior in different affective states. To collect data relating gaze to expressiveness, we analyzed computer-graphics films; this data served as the basis for the gaze expressions described in the proposed model. We also implemented a prototype and conducted user tests to observe the impact of eye behavior during several expressions of emotion. The results show that the model generates eye motions that are coherent with the affective states of the virtual character.
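The abstract does not detail the model's internals, but a system of this kind typically maps an affective state to a set of gaze parameters that an animation layer then realizes. The sketch below is purely illustrative and not the authors' implementation: the parameter names (eyelid openness, blink rate, gaze-aversion probability) and the numeric values in the table are assumptions chosen for the example.

```python
# Illustrative sketch only (not the paper's model): map an affective state
# to hypothetical gaze parameters that a facial-animation layer could consume.
from dataclasses import dataclass


@dataclass(frozen=True)
class GazeParams:
    eyelid_openness: float  # 0.0 (closed) .. 1.0 (wide open)
    blink_rate_hz: float    # blinks per second
    aversion_prob: float    # chance of looking away on each gaze shift


# Hypothetical parameter table for a few affective states; the values are
# invented for illustration, not taken from the paper's film analysis.
GAZE_BY_EMOTION = {
    "neutral": GazeParams(0.75, 0.25, 0.2),
    "joy":     GazeParams(0.90, 0.30, 0.1),
    "sadness": GazeParams(0.50, 0.15, 0.6),
    "fear":    GazeParams(1.00, 0.50, 0.4),
}


def gaze_for(emotion: str) -> GazeParams:
    """Return gaze parameters for an affective state, defaulting to neutral."""
    return GAZE_BY_EMOTION.get(emotion, GAZE_BY_EMOTION["neutral"])
```

In such a design, the lookup table is the natural place to encode observations gathered from film analysis, while the animation layer stays independent of how the parameters were derived.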