An enduring challenge in human-computer interaction (HCI) research is the creation of natural and intuitive interfaces. Beyond the obvious requirement that such interfaces communicate over modalities that are natural for humans, such as natural language (especially spoken) and gesture, exhibiting affect and adaptivity have also been identified as important factors in an interface's acceptance by users. In the work presented here, we propose a novel architecture for affective, multimodal dialogue systems that allows explicit control over the personality traits we want the system to exhibit. More specifically, we approach personality as a means of synthesising different, and possibly conflicting, adaptivity models into an overall model that drives the interaction components of the system. Furthermore, this synthesis is performed in the presence of domain knowledge, so that the structure of the domain and the relations within it influence the result of the calculation.
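The synthesis step described above can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it assumes, purely for illustration, that each adaptivity model emits a score for a candidate interaction decision and that personality traits act as blending weights over those models:

```python
# Hypothetical sketch: personality traits as weights that blend
# possibly conflicting adaptivity models into one overall score.
def blend_adaptivity(model_scores, trait_weights):
    """model_scores: {model_name: score in [0, 1]} for one candidate decision.
    trait_weights: {model_name: personality-derived weight in [0, 1]}.
    Returns the weighted average of the model scores."""
    total = sum(trait_weights.get(name, 0.0) for name in model_scores)
    if total == 0.0:
        return 0.0
    weighted = sum(score * trait_weights.get(name, 0.0)
                   for name, score in model_scores.items())
    return weighted / total

# Example: two adaptivity models disagree on, say, how verbose a
# response should be; the personality profile arbitrates between them.
scores = {"user_model": 0.9, "task_model": 0.2}   # illustrative names
weights = {"user_model": 0.75, "task_model": 0.25}
print(blend_adaptivity(scores, weights))  # 0.725
```

In the architecture described in the abstract, domain knowledge would additionally shape this combination (e.g. by modulating the weights per domain relation), which this toy example omits.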