Sentic Maxine: Multimodal Affective Fusion and Emotional Paths

  • Authors and affiliations:
  • Isabelle Hupont (Instituto Tecnológico de Aragón, Spain); Erik Cambria (National University of Singapore, Singapore); Eva Cerezo (University of Zaragoza, Spain); Amir Hussain (University of Stirling, United Kingdom); Sandra Baldassarri (University of Zaragoza, Spain)

  • Venue:
  • ISNN'12: Proceedings of the 9th International Conference on Advances in Neural Networks, Part II
  • Year:
  • 2012


Abstract

The capability of perceiving and expressing emotions through different modalities is a key issue for the enhancement of human-agent interaction. In this paper, an architecture for the development of intelligent multimodal affective interfaces is presented. It is based on the integration of Sentic Computing, a new opinion mining and sentiment analysis paradigm based on AI and Semantic Web techniques, with a facial emotion classifier and Maxine, a powerful multimodal animation engine for managing virtual agents and 3D scenarios. One of the main distinguishing features of the system is that it does not simply perform emotional classification in terms of a set of discrete emotional labels, but instead operates in a novel continuous 2D emotional space, enabling the output of a continuous emotional path that characterizes the user's affective progress over time. Another key factor is the proposed fusion methodology, which is able to fuse any number of unimodal categorical modules, with very different time scales, output labels, and recognition success rates, in a simple and scalable way.
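
As an illustration of the kind of categorical-to-dimensional fusion the abstract describes, the following Python sketch maps discrete emotion labels to points in a continuous 2D valence-arousal space and combines the outputs of several unimodal classifiers into a single fused point per time step; appending successive points yields an emotional path. This is a minimal sketch under assumed conventions, not the paper's actual algorithm: the label coordinates, modality weights, and function names are hypothetical placeholders.

```python
# Illustrative sketch (not the authors' algorithm): fuse the outputs of
# several unimodal categorical emotion classifiers into one point in a
# continuous 2D valence-arousal space, and accumulate those points over
# time into an "emotional path".

from typing import Dict, List, Tuple

# Hypothetical mapping from discrete emotion labels to (valence, arousal),
# both in [-1, 1]; the coordinates used by the real system may differ.
LABEL_TO_VA: Dict[str, Tuple[float, float]] = {
    "joy":      ( 0.8,  0.5),
    "anger":    (-0.6,  0.7),
    "sadness":  (-0.7, -0.4),
    "surprise": ( 0.3,  0.8),
    "neutral":  ( 0.0,  0.0),
}

def fuse_modalities(
    modality_outputs: List[Dict[str, float]],
    modality_weights: List[float],
) -> Tuple[float, float]:
    """Combine per-modality label probabilities into one 2D point.

    Each modality contributes the probability-weighted centroid of its
    labels' valence-arousal coordinates; modality_weights could, for
    example, reflect each module's recognition success rate.
    """
    valence = arousal = total_w = 0.0
    for probs, w in zip(modality_outputs, modality_weights):
        for label, p in probs.items():
            v, a = LABEL_TO_VA[label]
            valence += w * p * v
            arousal += w * p * a
        total_w += w
    return valence / total_w, arousal / total_w

# Usage: two modalities (e.g. a facial classifier and a text-based sentic
# module) observed at one time step; successive fused points form the path.
emotional_path: List[Tuple[float, float]] = []
face_probs = {"joy": 0.6, "neutral": 0.3, "surprise": 0.1}
text_probs = {"joy": 0.5, "sadness": 0.2, "neutral": 0.3}
emotional_path.append(fuse_modalities([face_probs, text_probs], [0.7, 0.9]))
print(emotional_path)
```

Because every unimodal module only needs to emit a probability distribution over its own label set, this style of fusion is agnostic to how many modalities are plugged in and to their individual label vocabularies, which is the scalability property the abstract highlights.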