Facilitative effects of communicative gaze and speech in human-robot cooperation
Proceedings of the 3rd international workshop on Affective interaction in natural environments
In the present work we observe two subjects interacting in a collaborative task in a shared environment. One goal of the experiment is to measure how gaze behavior changes when one interactant wears dark glasses, so that his or her gaze is not visible to the other. The results show that when one subject wears dark glasses while telling the other subject the position of a certain cube, the other subject needs significantly more time to locate and move that cube. Hence the gaze of one subject looking at a certain cube, when visible, speeds up the localization of that cube by the other subject. The second goal of the ongoing work is to collect data on the multimodal behavior of one of the subjects, by means of audio recording and eye-gaze and head-motion tracking, in order to build a model that can be used to control a robot in a comparable scenario in future experiments.