Personalized music emotion classification via active learning

  • Authors:
  • Dan Su; Pascale Fung

  • Affiliations:
  • Hong Kong University of Science & Technology, Hong Kong (both authors)

  • Venue:
  • Proceedings of the Second International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies
  • Year:
  • 2012


Abstract

We propose using active learning in a personalized music emotion classification framework to address subjectivity, one of the most challenging issues in music emotion recognition (MER). Personalization is the most direct way to tackle subjectivity in MER. However, almost all state-of-the-art personalized MER systems require a huge amount of user participation, which is a non-negligible problem in real systems. Active learning seeks to reduce human annotation effort by automatically selecting the most informative instances for human relabeling to train the classifier. Experimental results on a Chinese music dataset demonstrate that our method can reduce the required human annotation by as much as 80% without decreasing F-measure. We also investigated different query selection criteria for active learning and found that the informativeness criterion, which selects the most uncertain instances, performed best in general. Finally, we show that the condition for successful active learning in personalized MER is label consistency from the same user.
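The informativeness criterion described above can be sketched as entropy-based uncertainty sampling. The snippet below is a minimal illustration, not the authors' implementation: it assumes we already have a classifier's posterior probabilities over emotion classes for each unlabeled song, and it picks the instances whose predictions are most uncertain for the user to relabel. The function name and the probability values are hypothetical.

```python
import math

def select_most_uncertain(probs, k):
    """Rank unlabeled instances by prediction entropy (an informativeness
    measure) and return the indices of the k most uncertain ones, i.e.
    the instances worth querying the user about."""
    def entropy(p):
        # Shannon entropy of a class-probability distribution;
        # higher entropy means the classifier is less certain.
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    ranked = sorted(range(len(probs)),
                    key=lambda i: entropy(probs[i]),
                    reverse=True)
    return ranked[:k]

# Hypothetical posteriors over two emotion classes for three songs:
# the near-uniform prediction (index 1) is the most informative query.
probs = [[0.95, 0.05], [0.55, 0.45], [0.70, 0.30]]
print(select_most_uncertain(probs, 1))  # → [1]
```

In an active-learning loop, the classifier would be retrained after each batch of user relabels and the selection repeated, which is how the annotation savings reported above are obtained.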