Expressive interfaces

  • Authors:
  • Antonio Camurri, Barbara Mazzarino, Gualtiero Volpe

  • Affiliations:
  • DIST – University of Genova, InfoMus Lab – Laboratorio di Informatica Musicale, Italy (all authors)

  • Venue:
  • Cognition, Technology and Work
  • Year:
  • 2004

Abstract

Analysis of expressiveness in human gesture can lead to new paradigms for the design of improved human-machine interfaces, thus enhancing users’ participation and experience in mixed reality applications and context-aware mediated environments. The development of expressive interfaces that decode the highly affective information gestures convey opens novel perspectives in the design of interactive multimedia systems in several application domains: performing arts, museum exhibits, edutainment, entertainment, therapy, and rehabilitation. This paper describes recent developments in our research on expressive interfaces by presenting computational models and algorithms for the real-time analysis of expressive gestures in human full-body movement. Such analysis is discussed both as an example and as a basic component for the development of effective expressive interfaces. As a concrete result of our research, a software platform named EyesWeb was developed (http://www.eyesweb.org). Besides supporting research, EyesWeb has also been employed as a practical tool and open platform for developing real-time interactive applications.
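
To make the idea of real-time full-body movement analysis more tangible, the sketch below computes a simple silhouette-difference motion measure from a camera stream, a low-level cue of the kind commonly used in this line of work. It is an illustrative assumption, not the authors' EyesWeb modules: the function name `quantity_of_motion`, the fixed `threshold`, and the frame-differencing approach are all hypothetical choices made for this example.

```python
# Illustrative sketch: a frame-differencing estimate of overall body motion.
# Assumes a mostly static camera and background; not the EyesWeb implementation.
import cv2
import numpy as np

def quantity_of_motion(prev_frame, curr_frame, threshold=25):
    """Return the fraction of pixels that changed noticeably between frames.

    A rough proxy for how much full-body movement occurred in one frame step.
    """
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)          # per-pixel intensity change
    moving = diff > threshold                         # pixels considered "in motion"
    return float(np.count_nonzero(moving)) / moving.size

# Usage: track the motion level over time from a webcam stream.
cap = cv2.VideoCapture(0)
ok, prev = cap.read()
while ok:
    ok, curr = cap.read()
    if not ok:
        break
    qom = quantity_of_motion(prev, curr)
    print(f"quantity of motion: {qom:.3f}")
    prev = curr
cap.release()
```

In an expressive interface, a time series of such low-level measurements would typically be smoothed and combined with other cues before being mapped to higher-level expressive qualities; the sketch only covers the first, lowest-level step.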