User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements

  • Authors:
  • Ginevra Castellano, Roberto Bresin, Antonio Camurri, Gualtiero Volpe

  • Affiliations:
  • InfoMus Lab, DIST - University of Genova, Viale Causa 13, I-16145, Genova, Italy (Castellano, Camurri, Volpe); KTH, CSC School of Computer Science and Communication, Dept. of Speech, Music and Hearing, Stockholm, Sweden (Bresin)

  • Venue:
  • ACII '07: Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction
  • Year:
  • 2007


Abstract

In this paper we describe a system that allows users to express themselves through full-body movement and gesture and to control the generation of audio-visual feedback in real-time. The system analyses the user's full-body movement and gesture in real-time, extracts expressive motion features, and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, visual feedback generated in real-time is projected on a screen in front of the users, showing their silhouette coloured according to the emotion their movement communicates. Human movement analysis and visual feedback generation were implemented with the EyesWeb software platform, and the music performance rendering with pDM. Evaluation tests were carried out with human participants to assess the usability of the interface and the effectiveness of the design.
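
As a rough illustration of the mapping stage described in the abstract, the Python sketch below maps two hypothetical expressive motion features, quantity of motion and movement smoothness, onto acoustic control parameters, and pairs recognised emotions with silhouette colours. All names, ranges, and mappings here are assumptions for illustration only; the paper's actual feature set is extracted by EyesWeb and the acoustic control is realised in pDM.

```python
from dataclasses import dataclass

# Hypothetical expressive motion features, normalised to [0, 1].
# Feature names are illustrative; the actual feature set in the paper
# is extracted in real-time by EyesWeb.
@dataclass
class MotionFeatures:
    quantity_of_motion: float  # overall amount of detected movement
    smoothness: float          # fluidity of the movement trajectory

@dataclass
class AcousticParameters:
    tempo_scale: float     # multiplicative factor on the nominal tempo
    sound_level_db: float  # offset in dB on the nominal sound level

def clamp(x: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, x))

def map_features(f: MotionFeatures) -> AcousticParameters:
    """Linearly map normalised motion features onto control ranges.

    More movement -> faster and louder rendering; smoother movement
    -> a softer level offset. Ranges are arbitrary placeholders, not
    the values used by the authors.
    """
    qom = clamp(f.quantity_of_motion, 0.0, 1.0)
    smooth = clamp(f.smoothness, 0.0, 1.0)
    tempo = 0.7 + 0.6 * qom                            # 0.7x .. 1.3x
    level = -12.0 + 18.0 * qom * (1.0 - 0.5 * smooth)  # -12 .. +6 dB
    return AcousticParameters(tempo_scale=tempo, sound_level_db=level)

# Hypothetical emotion -> silhouette colour lookup (RGB) for the
# visual feedback; the paper's actual palette is not reproduced here.
EMOTION_COLOURS = {
    "happiness": (255, 220, 0),
    "anger": (220, 0, 0),
    "sadness": (0, 80, 200),
}

if __name__ == "__main__":
    # Example frame: energetic but jerky movement.
    params = map_features(MotionFeatures(quantity_of_motion=0.9, smoothness=0.2))
    print(params, EMOTION_COLOURS["anger"])
```

In the system described by the paper, such parameters would be computed and streamed frame by frame from the movement analysis to the pDM performance renderer, rather than in a one-shot script as above.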