A Framework for Supporting Multimodal Conversational Characters in a Multi-agent System

  • Authors:
  • Yasmine Arafa; Abe Mamdani


  • Venue:
  • ICMI '00 Proceedings of the Third International Conference on Advances in Multimodal Interfaces
  • Year:
  • 2000


Abstract

This paper discusses a computational framework for enabling multimodal conversational interface agents, embodied as lifelike characters, within a multi-agent environment. A commonly cited problem with such interface characters today is their inability to respond believably or adequately to the context of an interaction and the surrounding environment. Affective behaviour is used to better express responses to interaction context and to provide more believable visual expressive responses. We describe an operational approach to the computational perception required for the automated generation of affective behaviour through inter-agent communication in real-time multi-agent environments. The research investigates the potential of extending current agent communication languages so that they not only convey the semantic content of a knowledge exchange but also communicate affective attitudes about the shared knowledge. This provides a necessary component of the framework required for autonomous agent development, with which we may bridge the gap between current research in psychological theory and the practical implementation of social multi-agent systems.
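The idea of extending an agent communication language with affective attitudes can be illustrated with a minimal sketch. The class and field names below (`AffectiveACLMessage`, the `affect` slot, and its `emotion`/`intensity` keys) are illustrative assumptions, not the paper's actual encoding; the performative and slot style loosely follow FIPA-ACL conventions.

```python
from dataclasses import dataclass, field

@dataclass
class AffectiveACLMessage:
    """Hypothetical ACL-style message extended with an affective annotation.

    The standard slots (performative, sender, receiver, content) mirror
    FIPA-ACL; the extra `affect` slot carrying an appraisal of the shared
    content is the sketch's own assumption.
    """
    performative: str   # e.g. "inform", "request"
    sender: str
    receiver: str
    content: str
    affect: dict = field(default_factory=dict)  # e.g. {"emotion": "joy", "intensity": 0.7}

    def encode(self) -> str:
        # Render in an s-expression style reminiscent of ACL transport syntax.
        affect_str = " ".join(f":{k} {v}" for k, v in self.affect.items())
        return (f"({self.performative} :sender {self.sender} "
                f":receiver {self.receiver} :content \"{self.content}\" "
                f"(:affect {affect_str}))")

msg = AffectiveACLMessage("inform", "agentA", "characterB",
                          "task completed",
                          {"emotion": "joy", "intensity": 0.7})
print(msg.encode())
```

A receiving character agent could parse the `:affect` slot alongside the semantic content and use it to drive its visual expressive response, which is the coupling the framework argues for.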