The eNTERFACE'05 Audio-Visual Emotion Database

  • Authors:
  • O. Martin; I. Kotsia; B. Macq; I. Pitas

  • Affiliations:
  • Université catholique de Louvain, Belgium; Aristotle University of Thessaloniki, Greece; Université catholique de Louvain, Belgium; Aristotle University of Thessaloniki, Greece

  • Venue:
  • ICDEW '06 Proceedings of the 22nd International Conference on Data Engineering Workshops
  • Year:
  • 2006

Abstract

This paper presents an audio-visual emotion database that can serve as a reference for testing and evaluating video, audio, or joint audio-visual emotion recognition algorithms. It may also be used to evaluate algorithms for other multimodal signal processing tasks, such as multimodal person identification or audio-visual speech recognition. The paper discusses the difficulties involved in constructing such a multimodal emotion database and the protocols adopted to cope with them. It describes the experimental setup and includes a section on the segmentation and selection of the video samples, ensuring that the database contains only video sequences carrying the desired affective information. The database is made publicly available for scientific research purposes.