Web 2.0 enables information sharing and collaboration among users and, most notably, supports their active participation and creativity. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user-generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where this metadata enables retrieval relying not only on content-based (low-level) features but also on the textual descriptions represented by tags. However, if we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user-provided annotations for music tracks showed that the tag types which would be most beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood/theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we assess the quality of our recommended tags through a Facebook-based user study. Our results are very promising in comparison to both experts and users, and they provide interesting insights into possible extensions of music tagging systems to support music search.
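To make the tag-recommendation idea concrete, the following is a minimal illustrative sketch (not the paper's actual method) of one natural baseline: recommending mood labels for a track by how strongly its existing user tags co-occur with mood labels in a training corpus. All tag and mood names below are hypothetical toy data.

```python
from collections import defaultdict

# Toy training corpus: (user tags, known mood labels) per track.
# All names here are hypothetical, chosen only for illustration.
TRAINING = [
    ({"rock", "guitar", "loud"}, {"energetic"}),
    ({"piano", "slow", "rain"}, {"melancholy"}),
    ({"rock", "fast"}, {"energetic"}),
    ({"acoustic", "slow"}, {"melancholy", "calm"}),
]

def build_cooccurrence(training):
    """Count how often each user tag co-occurs with each mood label."""
    counts = defaultdict(lambda: defaultdict(int))
    tag_totals = defaultdict(int)
    for tags, moods in training:
        for t in tags:
            tag_totals[t] += 1
            for m in moods:
                counts[t][m] += 1
    return counts, tag_totals

def recommend_moods(tags, counts, tag_totals, top_k=2):
    """Score mood labels by co-occurrence with the track's tags,
    normalized by how often each tag appears overall."""
    scores = defaultdict(float)
    for t in tags:
        if tag_totals[t] == 0:
            continue  # unseen tag: contributes no evidence
        for m, c in counts[t].items():
            scores[m] += c / tag_totals[t]
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

counts, totals = build_cooccurrence(TRAINING)
print(recommend_moods({"rock", "slow"}, counts, totals))
```

A lyrics-based variant could score moods with text features (e.g. tf-idf over lyric terms) instead of tag co-occurrence, and the two signals could then be combined, mirroring the tag/lyrics/combination setup described above.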