On modelling emotional responses to rhythm features

  • Authors:
  • Jocelynn Cu, Rafael Cabredo, Roberto Legaspi, Merlin Teodosia Suarez

  • Affiliations:
  • Center for Empathic Human-Computer Interactions, De La Salle University, Philippines; Institute of Scientific and Industrial Research, Osaka University, Japan

  • Venue:
  • PRICAI'12: Proceedings of the 12th Pacific Rim International Conference on Trends in Artificial Intelligence
  • Year:
  • 2012


Abstract

Rhythm is one of the most essential elements of music and can easily capture a listener's attention. In this study, we explored various rhythm features and used them to build emotion models. The emotion labels are based on Thayer's Model of Mood, which comprises four quadrants: contentment, exuberance, anxiety, and depression. Empirical results identified 11 low-level rhythm features suited to classifying music emotion. We also determined that KStar can be used to build user-specific emotion models, achieving a precision of 0.476, recall of 0.480, and F-measure of 0.475.
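The pipeline described in the abstract can be sketched as follows. KStar is an instance-based learner (from the Weka toolkit); this sketch substitutes a plain 1-nearest-neighbour classifier as a rough stand-in, and the feature vectors and labels are synthetic placeholders, not the study's data. The 11-dimensional inputs mirror the 11 low-level rhythm features, and the four labels mirror Thayer's quadrants.

```python
import numpy as np

# Hypothetical sketch: classify music clips into Thayer's four quadrants
# from 11 low-level rhythm features. A 1-nearest-neighbour classifier
# stands in for KStar (both are instance-based learners).
LABELS = ["contentment", "exuberance", "anxiety", "depression"]
rng = np.random.default_rng(0)

X_train = rng.random((200, 11))          # synthetic rhythm-feature vectors
y_train = rng.integers(0, 4, size=200)   # synthetic quadrant labels
X_test = rng.random((40, 11))
y_test = rng.integers(0, 4, size=40)

def predict(x):
    """Return the label of the closest training example (Euclidean)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

y_pred = np.array([predict(x) for x in X_test])

# Macro-averaged precision and recall over the four classes, then F-measure.
precisions, recalls = [], []
for c in range(len(LABELS)):
    tp = np.sum((y_pred == c) & (y_test == c))
    precisions.append(tp / max(np.sum(y_pred == c), 1))
    recalls.append(tp / max(np.sum(y_test == c), 1))
precision, recall = np.mean(precisions), np.mean(recalls)
f_measure = 2 * precision * recall / (precision + recall)
print(f"precision={precision:.3f} recall={recall:.3f} F={f_measure:.3f}")
```

On real data, the reported user-specific scores (precision 0.476, recall 0.480, F-measure 0.475) would come from evaluating KStar itself rather than this simplified stand-in.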