A multimodal database for mimicry analysis

  • Authors:
  • Xiaofan Sun; Jeroen Lichtenauer; Michel Valstar; Anton Nijholt; Maja Pantic

  • Affiliations:
  • Human Media Interaction, University of Twente, Enschede, NL; Department of Computing, Imperial College London, UK; Department of Computing, Imperial College London, UK; Human Media Interaction, University of Twente, Enschede, NL; Human Media Interaction, University of Twente, Enschede, NL and Department of Computing, Imperial College London, UK

  • Venue:
  • ACII'11: Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction - Volume Part I
  • Year:
  • 2011


Abstract

In this paper we introduce a multimodal database for the analysis of human interaction, in particular mimicry, and elaborate on the theoretical hypotheses concerning the relationship between the occurrence of mimicry and human affect. The recorded experiments are designed to explore this relationship. The corpus is recorded with 18 synchronised audio and video sensors, and is annotated for many different phenomena, including dialogue acts, turn-taking, affect, head gestures, hand gestures, body movement and facial expression. Recordings were made of two experiments: a discussion on a political topic, and a role-playing game. A total of 40 participants were recruited, all of whom self-reported their felt experiences. The corpus will be made available to the scientific community.