Proceedings of the 8th International Conference on Autonomous Agents and Multiagent Systems - Volume 1
GNetIc – Using Bayesian Decision Networks for Iconic Gesture Generation
IVA '09 Proceedings of the 9th International Conference on Intelligent Virtual Agents
Degrees of grounding based on evidence of understanding
SIGdial '08 Proceedings of the 9th SIGdial Workshop on Discourse and Dialogue
Recognizing plan/goal abandonment
IJCAI '03 Proceedings of the 18th International Joint Conference on Artificial Intelligence
Interactive gesture in dialogue: a PTT model
SIGDIAL '09 Proceedings of the SIGDIAL 2009 Conference: The 10th Annual Meeting of the Special Interest Group on Discourse and Dialogue
Modeling the Production of Coverbal Iconic Gestures by Learning Bayesian Decision Networks
Applied Artificial Intelligence - Intelligent Virtual Agents
IVA '10 Proceedings of the 10th International Conference on Intelligent Virtual Agents
On factoring out a gesture typology from the Bielefeld speech-and-gesture-alignment corpus (SAGA)
GW '09 Proceedings of the 8th International Conference on Gesture in Embodied Communication and Human-Computer Interaction
Using group history to identify character-directed utterances in multi-child interactions
SIGDIAL '12 Proceedings of the 13th Annual Meeting of the Special Interest Group on Discourse and Dialogue
GW '11 Proceedings of the 9th International Conference on Gesture and Sign Language in Human-Computer Interaction and Embodied Communication
Although not yet well investigated, a crucial aspect of gesture use in dialogue is the regulation of interaction. People use gestures decisively, for example to signal that they want someone to take the turn, to 'brush away' what someone else said, or to acknowledge others' contributions. We present first insights from a corpus-based investigation of how gestures are used to regulate dialogue, and initial results from an approach that captures these phenomena in agent-based communication simulations. By extending a model for autonomous gesture generation to also cover gesture interpretation, this approach enables a full gestural turn-exchange cycle of generation, understanding, and acceptance/generation in virtual conversational agents.