Exploring emotions and multimodality in digitally augmented puppeteering

  • Authors:
  • Lassi A. Liikkanen; Giulio Jacucci; Eero Huvio; Toni Laitinen; Elisabeth André

  • Affiliations:
  • Helsinki Institute for Information Technology HIIT, TKK, Finland; Helsinki Institute for Information Technology HIIT, TKK, Finland; Helsinki Institute for Information Technology HIIT, TKK, Finland; Helsinki Institute for Information Technology HIIT, TKK, Finland; University of Augsburg, Augsburg, Germany

  • Venue:
  • AVI '08: Proceedings of the Working Conference on Advanced Visual Interfaces
  • Year:
  • 2008

Abstract

Recently, multimodal and affective technologies have been adopted to support expressive and engaging interaction, raising a host of new research questions. Among the challenges, two essential topics are 1) how to devise truly multimodal systems that can be used seamlessly for customized performance and content generation, and 2) how to track emotional cues and respond to them in order to create affective interaction loops. We present PuppetWall, a multi-user, multimodal system intended for digitally augmented puppeteering. The application lets users control puppets and manipulate playgrounds comprising backgrounds, props, and puppets through natural interaction. PuppetWall uses hand movement tracking, a multi-touch display, and emotional speech recognition as its input modalities. Here we document the technical features of the system and an initial evaluation. The evaluation involved two professional actors and also aimed at exploring naturally emerging expressive speech categories. We conclude by summarizing the challenges of tracking emotional cues from acoustic features and their relevance for the design of affective interactive systems.
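
To illustrate the kind of affective interaction loop the abstract refers to (this is not the authors' implementation), a minimal sketch might map emotion categories recognized from speech to puppet responses. All names, categories, and thresholds below are hypothetical and chosen only for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Puppet:
    """Minimal stand-in for an on-screen puppet."""
    name: str
    expression: str = "neutral"

    def set_expression(self, expression: str) -> None:
        self.expression = expression
        print(f"{self.name}: expression -> {expression}")


def make_affective_loop(puppet: Puppet) -> Callable[[str, float], None]:
    """Return a handler that maps recognized emotion cues to puppet responses."""
    # Hypothetical mapping from recognized emotion labels to puppet expressions.
    responses: Dict[str, str] = {
        "joy": "smile",
        "anger": "frown",
        "sadness": "droop",
        "neutral": "idle",
    }

    def on_emotion(label: str, confidence: float) -> None:
        # React only to cues the recognizer reports with sufficient confidence.
        if confidence >= 0.6 and label in responses:
            puppet.set_expression(responses[label])

    return on_emotion


if __name__ == "__main__":
    handler = make_affective_loop(Puppet("Punch"))
    # Simulated recognizer output: (emotion label, confidence score).
    for label, conf in [("joy", 0.8), ("anger", 0.4), ("sadness", 0.9)]:
        handler(label, conf)
```

In a real system such as the one described, the handler would be fed by a speech emotion recognizer and would drive puppet animation rather than printing to the console; the confidence threshold reflects the paper's point that acoustic emotion cues are noisy and must be filtered before driving the interaction.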