Speakers in dialogue tend to adapt to each other by starting to use similar lexical items, syntactic structures, or gestures. This behaviour, called alignment, may serve important cognitive, communicative and social functions (such as speech facilitation, grounding and rapport). Our aim is to enable and study the effects of these subtle aspects of communication in virtual conversational agents. Building upon a model for autonomous speech and gesture generation, we describe an approach to making the agent's multimodal behaviour adaptive in an interactive manner. This comprises (1) an activation-based microplanner that makes linguistic choices based on lexical and syntactic priming, and (2) empirically grounded gesture generation in which linguistic priming is paralleled by concordant gestural adaptation. First results show that the agent aligns to its interaction partners by picking up their syntactic structures and lexical items in its subsequent utterances. These changes in the agent's verbal behaviour also directly influence its gestural expressions.
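To illustrate the idea of activation-based priming described above, here is a minimal, hypothetical sketch (not the authors' actual microplanner): each lexical or syntactic option carries an activation value that is boosted when the interlocutor uses it and decays over time, and the planner prefers the most activated alternative, producing alignment.

```python
class PrimingMicroplanner:
    """Toy activation-based choice model: options recently used by the
    interlocutor receive an activation boost that decays over turns, so
    the planner tends to align lexically and syntactically."""

    def __init__(self, decay=0.8, boost=1.0):
        self.decay = decay          # per-turn activation decay factor
        self.boost = boost          # activation added per observed use
        self.activation = {}        # option -> current activation

    def observe(self, items):
        """Register items (words, syntactic rules) the partner just used."""
        for key in self.activation:
            self.activation[key] *= self.decay
        for item in items:
            self.activation[item] = self.activation.get(item, 0.0) + self.boost

    def choose(self, alternatives):
        """Pick the most activated alternative (first one on a tie)."""
        return max(alternatives, key=lambda a: self.activation.get(a, 0.0))


# Example: two synonymous lexical choices for the same referent.
planner = PrimingMicroplanner()
default = planner.choose(["sofa", "couch"])   # no priming yet
planner.observe(["couch"])                    # partner says "couch"
aligned = planner.choose(["sofa", "couch"])   # priming shifts the choice
```

Under this sketch, `default` is the planner's unprimed preference, while `aligned` reflects the partner's recent lexical choice; extending the activation table to syntactic rules would model structural priming in the same way.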