This paper presents a novel content-based, query-by-tag music search system for an untagged music database. We design a new tag query interface that lets users input multiple tags with multiple levels of preference (denoted an MTML query) by colorizing desired tags in a web-based tag cloud. When a user clicks and holds the left mouse button (or presses and holds a finger on a touch screen) over a desired tag, the tag's color cycles through a color map from dark blue to bright red, representing a preference level from 0 to 1. In this way, the user can easily compose and review a query of multiple tags with multiple preference levels through the colored tags. To support MTML content-based music retrieval, we introduce a probabilistic fusion model (denoted GMFM) that combines two mixture models, a Gaussian mixture model and a multinomial mixture model, to jointly model a song's auditory features and tag labels. Two indexing methods and their corresponding matching methods, pseudo-song-based matching and tag affinity-based matching, are built on the pre-learned GMFM. We evaluate the proposed system on the MajorMiner and CAL-500 datasets; the experimental results demonstrate the effectiveness of GMFM and the potential of MTML queries for searching an untagged music database.
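The two ideas in the abstract — mapping a preference level to a tag color, and ranking songs against an MTML query — can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the linear dark-blue-to-red interpolation, the `preference_to_color` and `rank_songs` helpers, and the dot-product scoring over per-song tag affinities are hypothetical stand-ins, not the paper's implementation (which derives tag affinities from the learned GMFM).

```python
def preference_to_color(p):
    """Map a preference level p in [0, 1] to an RGB color.

    Assumed linear interpolation from dark blue (p = 0) to bright
    red (p = 1); the paper specifies only the color map's endpoints.
    """
    p = min(max(p, 0.0), 1.0)  # clamp to the valid range
    dark_blue, bright_red = (0, 0, 139), (255, 0, 0)
    return tuple(round(a + (b - a) * p) for a, b in zip(dark_blue, bright_red))


def score_song(query, tag_affinity):
    """Score one song against an MTML query.

    query: {tag: preference in [0, 1]} composed via the tag cloud.
    tag_affinity: {tag: affinity in [0, 1]} precomputed per song
    (in the paper, from the pre-learned GMFM; here just numbers).
    The weighted-sum scoring is an illustrative choice.
    """
    return sum(w * tag_affinity.get(tag, 0.0) for tag, w in query.items())


def rank_songs(query, database):
    """Return song ids sorted by descending query score."""
    return sorted(database, key=lambda sid: score_song(query, database[sid]),
                  reverse=True)
```

For example, a query `{"jazz": 1.0, "mellow": 0.5}` against a song tagged with high affinity for "jazz" and "mellow" ranks that song above one with affinity mostly for "rock"; the tag names and affinity values here are invented for illustration.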