Affective computing with primary and secondary emotions in a virtual human

  • Authors:
  • Christian Becker-Asano; Ipke Wachsmuth

  • Affiliations:
  • Faculty of Technology, University of Bielefeld, Bielefeld, Germany 33594 and ATR, Kyoto, Japan 619-0288; Faculty of Technology, University of Bielefeld, Bielefeld, Germany 33594

  • Venue:
  • Autonomous Agents and Multi-Agent Systems
  • Year:
  • 2010

Abstract

We introduce the WASABI ([W]ASABI [A]ffect [S]imulation for [A]gents with [B]elievable [I]nteractivity) Affect Simulation Architecture, in which a virtual human's cognitive reasoning capabilities are combined with simulated embodiment to achieve the simulation of primary and secondary emotions. In modeling primary emotions, we follow the idea of "Core Affect" in combination with a continuous progression of bodily feeling in three-dimensional emotion space (PAD space), which is subsequently categorized into discrete emotions. In humans, primary emotions are understood as ontogenetically earlier emotions, which directly influence facial expressions. Secondary emotions, in contrast, afford the ability to reason about current events in the light of experiences and expectations. By technically representing aspects of each secondary emotion's connotative meaning in PAD space, we not only ensure their mood-congruent elicitation, but also combine them with facial expressions, which are concurrently driven by primary emotions. Results of an empirical study suggest that human players in a card game scenario judge our virtual human MAX to be significantly older when secondary emotions are simulated in addition to primary ones.
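
The abstract describes mapping a continuous PAD (pleasure–arousal–dominance) state onto discrete primary emotions. The sketch below illustrates one plausible nearest-anchor categorization of this kind; the anchor coordinates, activation threshold, and function name are illustrative assumptions for exposition, not the WASABI architecture's actual parameters or implementation.

```python
import math
from typing import Optional

# Hypothetical PAD anchor points (pleasure, arousal, dominance) for a few
# primary emotions. The real WASABI architecture defines its own anchors
# and thresholds, which are not given in this abstract.
PRIMARY_EMOTION_ANCHORS = {
    "joy":     ( 0.8,  0.6,  0.4),
    "anger":   (-0.6,  0.7,  0.6),
    "fear":    (-0.7,  0.6, -0.6),
    "sadness": (-0.7, -0.4, -0.5),
    "boredom": (-0.2, -0.8, -0.1),
}

def categorize_pad(pleasure: float, arousal: float, dominance: float,
                   threshold: float = 0.9) -> Optional[str]:
    """Map a continuous PAD state to the nearest discrete primary emotion.

    Returns the emotion whose anchor lies closest to the current PAD point,
    or None if no anchor is within the activation threshold.
    """
    best_emotion, best_dist = None, float("inf")
    for emotion, anchor in PRIMARY_EMOTION_ANCHORS.items():
        dist = math.dist((pleasure, arousal, dominance), anchor)
        if dist < best_dist:
            best_emotion, best_dist = emotion, dist
    return best_emotion if best_dist <= threshold else None

# Example: a pleasant, mildly aroused, somewhat dominant state reads as "joy".
print(categorize_pad(0.7, 0.5, 0.3))  # -> joy
```

In this reading, the continuous dynamics of the bodily feeling would update the PAD point over time, and the categorization step would then yield the mood-congruent discrete emotion that drives the facial expression.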