Affect detection and an automated improvisational AI actor in E-drama

  • Authors:
  • Li Zhang; Marco Gillies; John A. Barnden; Robert J. Hendley; Mark G. Lee; Alan M. Wallington

  • Affiliations:
  • School of Computing and Technology, University of East London, London; Department of Computer Science, University College London, London; School of Computer Science, University of Birmingham, Birmingham; School of Computer Science, University of Birmingham, Birmingham; School of Computer Science, University of Birmingham, Birmingham; School of Computer Science, University of Birmingham, Birmingham

  • Venue:
  • ICMI'06/IJCAI'07: Proceedings of the ICMI 2006 and IJCAI 2007 International Conference on Artificial Intelligence for Human Computing
  • Year:
  • 2007

Abstract

Enabling machines to understand the emotions and feelings that human users express in natural language textual input during interaction is a challenging issue in Human Computing. The work presented here is our contribution toward such machine automation. We report on adding affect detection to an existing e-drama program, a text-based software system for dramatic improvisation in simple virtual scenarios, intended primarily for use in learning contexts. The system allows a human director to monitor improvisations and make interventions, for instance in reaction to excessive, insufficient or inappropriate emotions in the characters' speeches. As part of an effort to partially automate the director's functions, and to allow for automated affective bit-part characters, we have developed an affect-detection module. It is aimed at detecting affective aspects (concerning emotions, moods, value judgments, etc.) of human-controlled characters' textual "speeches". The work is also accompanied by basic research into how affect is conveyed linguistically; a distinctive feature of the project is its focus on the metaphorical ways in which affect is conveyed. Moreover, we describe how the detected affective states activate the animation engine to produce gestures for the human-controlled characters. The description of our approach in this paper draws in part on our previous publications [1, 2], with new contributions mainly on metaphorical language processing (practical and theoretical), 3D emotional animation generation and user-testing evaluation. Finally, our work on affect detection in open-ended improvisational text contributes to the development of automatic understanding of human language and emotion. The generation of believable emotional animations based on detected affective states, and the production of appropriate responses for the automated affective bit-part character, contribute greatly to the ease of use and the innovative user interface of e-drama, leading to high levels of user engagement and enjoyment.
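
As a rough illustration of the pipeline the abstract describes (a character's typed speech goes in, an affect label comes out, and a gesture command is sent to the animation engine), the following Python sketch may help. It is a minimal, hypothetical mock-up: the cue lexicon, affect labels and gesture names are illustrative assumptions, not the paper's actual rule base or API.

```python
# Hypothetical sketch of the e-drama affect-to-gesture pipeline: detect a coarse
# affect label from one line of character "speech", then map it to a gesture
# command for the animation engine. All cues, labels and gesture names below are
# illustrative placeholders.

# Toy lexicon: surface cues (including a couple of metaphorical phrases) -> affect label.
AFFECT_CUES = {
    "hate": "angry",
    "shut up": "angry",
    "boiling with": "angry",      # metaphorical anger cue
    "love": "happy",
    "great": "happy",
    "scared": "afraid",
    "worried": "afraid",
    "sorry": "sad",
    "down in the dumps": "sad",   # metaphorical sadness cue
}

GESTURES = {
    "angry": "clench_fists",
    "happy": "open_arms",
    "afraid": "step_back",
    "sad": "lower_head",
    "neutral": "idle",
}


def detect_affect(speech: str) -> str:
    """Return a coarse affect label for one line of character speech."""
    text = speech.lower()
    for cue, label in AFFECT_CUES.items():
        if cue in text:
            return label
    return "neutral"


def gesture_for(speech: str) -> str:
    """Map a speech line to a gesture name for the animation engine."""
    return GESTURES[detect_affect(speech)]


if __name__ == "__main__":
    for line in ["I hate you, just shut up!",
                 "I'm down in the dumps today...",
                 "See you tomorrow."]:
        print(f"{line!r} -> affect={detect_affect(line)}, gesture={gesture_for(line)}")
```

The actual system described in the paper goes well beyond keyword matching (notably in its metaphorical language processing), but the toy above conveys the interface between the affect-detection module, the director/bit-part logic and the animation engine.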