A Bimodal Face and Body Gesture Database for Automatic Analysis of Human Nonverbal Affective Behavior

  • Authors: Hatice Gunes; Massimo Piccardi
  • Affiliations: University of Technology, Sydney (UTS), Australia; University of Technology, Sydney (UTS), Australia
  • Venue: ICPR '06 Proceedings of the 18th International Conference on Pattern Recognition - Volume 01
  • Year: 2006


Abstract

To develop and test robust affective multimodal systems, researchers need access to novel databases containing representative samples of human multimodal expressive behavior. The creation of such databases requires a major effort in the definition of representative behaviors, the choice of expressive modalities, and the collection and labeling of large amounts of data. At present, publicly available databases exist only for single expressive modalities, such as facial expression analysis. A number of gesture databases of static and dynamic hand postures and dynamic hand gestures also exist. However, there is no readily available database combining affective face and body information in a genuine bimodal manner. Accordingly, in this paper we present a bimodal database, recorded by two high-resolution cameras simultaneously, for use in automatic analysis of human nonverbal affective behavior.