EmoHeart: conveying emotions in Second Life based on affect sensing from text

  • Authors:
  • Alena Neviarouskaya;Helmut Prendinger;Mitsuru Ishizuka

  • Affiliations:
  • Department of Information and Communication Engineering, University of Tokyo, Tokyo, Japan;Digital Content and Media Sciences Research Division, National Institute of Informatics, Tokyo, Japan;Department of Information and Communication Engineering, University of Tokyo, Tokyo, Japan

  • Venue:
  • Advances in Human-Computer Interaction - Special issue on emotion-aware natural interaction
  • Year:
  • 2010

Abstract

The 3D virtual world of "Second Life" imitates a form of real life by providing a space for rich interactions and social events. Second Life encourages people to establish or strengthen interpersonal relations, to share ideas, to gain new experiences, and to feel genuine emotions accompanying the adventures of virtual reality. Undoubtedly, emotions play a powerful role in communication. However, to trigger a visual display of the user's affective state in a virtual world, the user has to manually assign an appropriate facial expression or gesture to his or her avatar. Affect sensing from text, which enables the automatic expression of emotions in the virtual environment, avoids this manual control and enriches remote communication effortlessly. In this paper, we describe a lexical rule-based approach to the recognition of emotions from text and an application of the developed Affect Analysis Model in Second Life. Based on the result of the Affect Analysis Model, the developed EmoHeart (an "object" in Second Life) triggers animations of avatar facial expressions and visualizes emotion through heart-shaped textures.
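
As a rough illustration of how a lexical rule-based approach of this kind might operate, the following Python sketch maps emotion keywords in a chat message to an emotion category that could then drive an avatar animation. The lexicon entries, emotion categories, and function names here are hypothetical assumptions for illustration only, not the paper's actual Affect Analysis Model or its resources.

```python
# Minimal sketch of a lexical rule-based emotion classifier, in the spirit
# of (but not identical to) the paper's Affect Analysis Model.
# The lexicon and category set below are invented for illustration.

from collections import Counter

# Hypothetical keyword lexicon mapping words to emotion categories.
EMOTION_LEXICON = {
    "happy": "joy", "glad": "joy", "love": "joy",
    "sad": "sadness", "cry": "sadness",
    "angry": "anger", "hate": "anger",
    "afraid": "fear", "scared": "fear",
    "wow": "surprise", "amazing": "surprise",
}

def classify_emotion(text: str) -> str:
    """Return the dominant emotion category for a chat message,
    or 'neutral' if no emotion keywords are found."""
    tokens = text.lower().split()
    hits = Counter(EMOTION_LEXICON[t] for t in tokens if t in EMOTION_LEXICON)
    if not hits:
        return "neutral"
    return hits.most_common(1)[0][0]

if __name__ == "__main__":
    print(classify_emotion("I am so happy to see you"))  # -> joy
    print(classify_emotion("That movie made me cry"))    # -> sadness
```

In the system described by the paper, the recognized emotion category would be passed to the in-world EmoHeart object, which selects the matching facial-expression animation and heart-shaped texture; the simple keyword lookup above stands in for the richer lexical and rule-based analysis the authors report.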