Categorical music emotion classification, which divides emotion into discrete classes, has reached a performance limit when audio features alone are used, owing to the semantic gap between object-level features and the human cognitive level of emotion perception. Motivated by the fact that lyrics carry rich semantic information about a song, we propose a multi-modal approach to improve categorical music emotion classification. By exploiting both the audio features and the lyrics of a song, the proposed approach raises 4-class emotion classification accuracy from 46.6% to 57.1%. The results also show that incorporating lyrics significantly improves classification accuracy along the valence dimension.
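The abstract does not specify how the two modalities are combined. A minimal sketch of one common scheme, early (feature-level) fusion, concatenates the audio and lyric feature vectors of each song before training a single classifier. All names, feature dimensions, and the synthetic data below are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_songs, n_audio, n_lyric = 200, 20, 50  # hypothetical feature dimensions

# Synthetic stand-ins for per-song audio features (e.g. timbre/rhythm
# descriptors) and lyric features (e.g. bag-of-words weights), with
# 4 emotion classes; class-dependent shifts make the task learnable.
y = rng.integers(0, 4, n_songs)
audio = rng.normal(size=(n_songs, n_audio)) + y[:, None] * 0.5
lyric = rng.normal(size=(n_songs, n_lyric)) + y[:, None] * 0.3

# Early fusion: concatenate both modalities into one feature vector.
X = np.hstack([audio, lyric])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A single SVM is trained on the fused representation.
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"4-class accuracy on fused features: {acc:.2f}")
```

An alternative, late (decision-level) fusion, would instead train one classifier per modality and merge their predicted class probabilities; either scheme fits the multi-modal setup the abstract describes.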