Comparing emotions using acoustics and human perceptual dimensions

  • Authors:
  • Keshi Dai (Northeastern University, Boston, MA, USA); Harriet Fell (Northeastern University, Boston, MA, USA); Joel MacAuslan (Speech Technology and Applied Research, Bedford, MA, USA)

  • Venue:
  • CHI '09 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2009

Abstract

Understanding the differences among emotions in terms of acoustic features is important for computer recognition and classification of emotions. We conducted a study of human perception of six emotions along three perceptual dimensions and compared the resulting human classification with a machine classification based on a large set of acoustic parameters. The results show that the six emotions cluster differently under acoustic features than under perceptual dimensions. In particular, the acoustic features fail to characterize the perceptual dimension of valence. More research is needed to identify acoustic features that correspond closely to human perception.