Exploiting online music tags for music emotion classification

  • Authors:
  • Yu-Ching Lin;Yi-Hsuan Yang;Homer H. Chen

  • Affiliations:
  • National Taiwan University, Taipei, Taiwan;Academia Sinica, Taipei, Taiwan;National Taiwan University, Taipei, Taiwan

  • Venue:
  • ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP) - Special section on ACM multimedia 2010 best paper candidates, and issue on social media
  • Year:
  • 2011


Abstract

The online repository of music tags provides a rich source of semantic descriptions useful for training an emotion-based music classifier. However, the imbalance of the online tags degrades the performance of emotion classification. In this paper, we present a novel data-sampling method that eliminates the imbalance while still taking the prior probability of each emotion class into account. In addition, a two-layer emotion classification structure is proposed to harness the genre information available in the online repository of music tags. We show that genre-based grouping as a precursor greatly improves the performance of emotion classification. On average, the incorporation of online genre tags improves the performance of emotion classification by 55% over the conventional single-layer system. The performance of our algorithm for classifying 183 emotion classes reaches 0.36 in example-based f-score.
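The example-based f-score mentioned above is a standard metric for multi-label classification: for each track, the f-score between the predicted and ground-truth label sets is computed, and the results are averaged over all tracks. A minimal sketch (the label names are hypothetical; the paper's exact evaluation protocol may differ in details such as handling of empty label sets):

```python
def example_based_f(true_labels, pred_labels):
    """Average per-example F-score over a multi-label dataset.

    Each element of true_labels / pred_labels is an iterable of labels
    (e.g. emotion tags) for one track. Per example:
        F = 2 * |Y ∩ Z| / (|Y| + |Z|)
    where Y is the true label set and Z the predicted set.
    """
    total = 0.0
    for y, z in zip(true_labels, pred_labels):
        y, z = set(y), set(z)
        if not y and not z:
            total += 1.0  # convention: empty vs. empty counts as perfect
        else:
            total += 2 * len(y & z) / (len(y) + len(z))
    return total / len(true_labels)


# Hypothetical example: one track tagged {happy, calm}, predicted {happy}.
# Per-example F = 2 * 1 / (2 + 1) = 2/3.
score = example_based_f([["happy", "calm"]], [["happy"]])
```

A reported score of 0.36 over 183 emotion classes thus means that, on average, the overlap between predicted and true tag sets yields an F-score of 0.36 per track.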