Conveying Emotions through Facially Animated Avatars in Networked Virtual Environments

  • Authors:
  • Fabian Di Fiore, Peter Quax, Cedric Vanaken, Wim Lamotte, Frank Van Reeth

  • Affiliations:
  • Hasselt University - tUL - IBBT Expertise Centre for Digital Media, BE-3590 Diepenbeek, Belgium (all authors)

  • Venue:
  • Motion in Games
  • Year:
  • 2008


Abstract

In this paper, our objective is to facilitate the way in which emotion is conveyed through avatars in virtual environments. The established approach requires the end user to manually select his/her emotional state through a text-based interface (using emoticons and/or keywords), after which pre-defined emotional states are applied to the avatar. In contrast to this rather rudimentary solution, we envisage a system that automatically extracts emotion-related metadata from a video stream, most often originating from a webcam. Unlike transmitting entire video streams, which yields optimal quality but is often prohibitive in terms of bandwidth usage, this metadata extraction process enables the system to be deployed in large-scale environments, as the bandwidth required for the communication channel is drastically reduced.
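The bandwidth argument can be made concrete with a small sketch. The packet layout, emotion labels, and function names below are illustrative assumptions, not the paper's actual protocol: the point is only that a per-update emotion-metadata packet is orders of magnitude smaller than a raw webcam frame.

```python
import struct

# Hypothetical metadata packet (an assumption for illustration): instead of
# streaming raw webcam frames, each client periodically sends intensities
# for a handful of basic emotions that drive the avatar's facial animation.
EMOTIONS = ("neutral", "happy", "sad", "angry", "surprised", "afraid")

def pack_emotion_metadata(avatar_id: int, intensities: dict) -> bytes:
    """Serialize per-emotion intensities (clamped to 0.0-1.0) into a
    fixed-size network packet: a 4-byte avatar id followed by six floats."""
    values = [max(0.0, min(1.0, intensities.get(name, 0.0)))
              for name in EMOTIONS]
    return struct.pack("!I6f", avatar_id, *values)

packet = pack_emotion_metadata(42, {"happy": 0.8, "surprised": 0.2})

# An uncompressed 640x480 RGB webcam frame is roughly 900 KB; the
# metadata packet above is 28 bytes (4-byte id + six 4-byte floats).
print(len(packet))  # 28
```

Even at a high update rate (e.g. 25 updates per second), such a channel stays under 1 KB/s per avatar, which is what makes large-scale deployment plausible.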