EyesWeb - Toward Gesture and Affect Recognition in Dance/Music Interactive Systems

  • Authors:
  • Antonio Camurri; Matteo Ricchetti; Riccardo Trocca

  • Affiliations:
  • Università di Genova; Università di Genova; Università di Genova

  • Venue:
  • ICMCS '99 Proceedings of the IEEE International Conference on Multimedia Computing and Systems - Volume 2
  • Year:
  • 1999

Abstract

The EyesWeb project concerns the development of a system for real-time analysis of the full-body movement and gestures of one or more humans, with a particular focus on affective or emotional content. This information is used to generate and control sound, music, and visual media, and to drive actuators (e.g., robots). A main goal is to explore extensions of music language toward gesture and visual languages. The paper describes the current state of the project and the design of the EyesWeb system modules (hardware and software), reports on their experimentation in public events, and discusses ongoing developments.
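The pipeline sketched in the abstract, extracting a movement feature from camera input and mapping it to a media-control parameter, can be illustrated with a minimal example. This is not the EyesWeb implementation; the function names and the frame-difference feature are hypothetical, chosen only to show the shape of such a mapping (motion feature in, MIDI-style control value out).

```python
def quantity_of_motion(prev_frame, frame):
    """Hypothetical motion feature: mean absolute pixel difference
    between two consecutive grayscale frames (pixel values 0-255).
    Flattened frames are used here for simplicity."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, frame))
    return total / len(frame)

def motion_to_volume(qom, max_diff=255.0):
    """Map the motion feature linearly onto a MIDI-style
    control value in the range 0-127 (e.g., sound volume)."""
    return round(min(qom / max_diff, 1.0) * 127)

# Toy frames: a still scene, then one with large local movement.
still = [10] * 8
moved = [10, 10, 200, 200, 10, 10, 200, 200]

print(motion_to_volume(quantity_of_motion(still, still)))  # → 0
print(motion_to_volume(quantity_of_motion(still, moved)))  # → 47
```

A real system would of course compute richer features (trajectories, expressive qualities) over video frames, but the core idea, continuous movement analysis driving continuous media parameters, has this structure.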