In this paper, we investigate users' eye gaze behavior during conversations with an interactive storytelling application. We present an interactive eye gaze model for embodied conversational agents, designed to improve the experience of users participating in interactive storytelling. The underlying narrative in which the approach was tested is based on a classic nineteenth-century psychological novel, Gustave Flaubert's Madame Bovary. At various stages of the narrative, the user can address the main character or respond to her using free-style spoken natural language input, impersonating her lover. An eye tracker was connected so that the interactive gaze model could respond to the user's current gaze (i.e., whether or not the user is looking into the virtual character's eyes). In a study with 19 students, we compared our interactive eye gaze model with a non-interactive eye gaze model that was informed by studies of human gaze behavior but had no information about where the user was looking. The interactive model received higher user ratings than the non-interactive model. In addition, we analyzed the users' gaze behavior during their conversations with the virtual character.
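The core gaze-contingent idea — the agent reacting differently depending on whether the user's tracked gaze falls within the character's eye region — can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's implementation: the eye tracker is assumed to report screen coordinates, the character's eye region is modeled as a rectangle, and the names `EyeRegion`, `agent_gaze_response`, and the `avert_after` threshold are hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class EyeRegion:
    """Axis-aligned screen rectangle covering the character's eyes (assumed geometry)."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        """True if the tracked gaze point (gx, gy) lies inside the eye region."""
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)


def agent_gaze_response(gaze_point, eye_region, mutual_frames, avert_after=90):
    """Pick the agent's gaze behavior for one frame.

    Returns (behavior, updated_mutual_frames). While the user looks into the
    character's eyes, the agent holds mutual gaze, briefly averting after
    `avert_after` consecutive frames so the stare does not feel unnatural;
    otherwise the agent looks toward the user. The aversion heuristic is an
    illustrative assumption, not the model described in the paper.
    """
    if gaze_point is not None and eye_region.contains(*gaze_point):
        mutual_frames += 1
        if mutual_frames > avert_after:
            return "avert", 0          # break a sustained mutual stare
        return "mutual", mutual_frames  # reciprocate the user's eye contact
    return "look_at_user", 0            # user looked away; re-invite engagement
```

In practice such a decision function would run once per tracker sample, with the returned behavior label driving the character's animation controller.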