An embodied dialogue system with personality and emotions

  • Authors: Stasinos Konstantopoulos
  • Affiliation: NCSR 'Demokritos', Athens, Greece
  • Venue: CDS '10 Proceedings of the 2010 Workshop on Companionable Dialogue Systems
  • Year: 2010

Abstract

An enduring challenge in human-computer interaction (HCI) research is the creation of natural and intuitive interfaces. Besides the obvious requirement that such interfaces communicate over modalities that are more natural for humans, such as (especially spoken) natural language and gesturing, exhibiting affect and adaptivity have also been identified as important factors in the interface's acceptance by the user. In the work presented here, we propose a novel architecture for affective and multimodal dialogue systems that allows explicit control over the personality traits that we want the system to exhibit. More specifically, we approach personality as a means of synthesising different, and possibly conflicting, adaptivity models into an overall model that drives the interaction components of the system. Furthermore, this synthesis is performed in the presence of domain knowledge, so that domain structure and relations influence the results of the calculation.
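
The abstract does not specify how the synthesis is computed; the sketch below is only one plausible reading of it, in which personality traits act as weights that blend conflicting adaptivity models into a single model driving the interaction components. All names (combine_adaptivity, the trait and parameter dictionaries) are hypothetical illustrations, not the paper's actual method, and the domain-knowledge influence mentioned in the abstract is omitted here for brevity.

```python
from typing import Dict

# Hypothetical adaptivity models: each maps a dialogue parameter
# (e.g. verbosity, initiative, expressiveness) to a preferred value in [0, 1].
task_model = {"verbosity": 0.2, "initiative": 0.9, "expressiveness": 0.3}
social_model = {"verbosity": 0.8, "initiative": 0.4, "expressiveness": 0.9}

# Hypothetical personality profile: trait values in [0, 1] that determine
# how much weight each adaptivity model receives in the overall model.
personality = {"task_orientation": 0.7, "sociability": 0.3}


def combine_adaptivity(models: Dict[str, Dict[str, float]],
                       weights: Dict[str, float]) -> Dict[str, float]:
    """Blend several (possibly conflicting) adaptivity models into one
    overall model by a personality-weighted average of their parameters."""
    total = sum(weights.values()) or 1.0
    params = {p for model in models.values() for p in model}
    return {
        p: sum(weights[name] * models[name].get(p, 0.0) for name in models) / total
        for p in params
    }


overall_model = combine_adaptivity(
    {"task_orientation": task_model, "sociability": social_model},
    personality,
)
print(overall_model)  # single model used to drive the interaction components
```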