Finding geographically representative music via social media
MIRUM '11 Proceedings of the 1st international ACM workshop on Music information retrieval with user-centered and multimodal strategies
This work identifies relevant songs from a user's personal music collection to accompany pictures of an event. The event's pictures are analyzed to extract aggregated semantic concepts along several dimensions, including scene type, geospatial information, and event type, together with user-provided keywords. These semantic concepts then form a search query against a song database, matched primarily on song lyrics. Songs are scored with probabilistic techniques to produce a rank-ordered list of candidates that could serve as, e.g., the audio track in a multimedia slideshow.
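The concept-to-song matching described above can be illustrated with a small sketch. The paper does not specify its scoring model, so the toy lyric database, the concept keywords, and the use of Jelinek-Mercer smoothed query likelihood here are all illustrative assumptions, not the authors' implementation:

```python
import math
from collections import Counter

# Hypothetical toy lyric database; the actual corpus and scoring
# model used in the paper are not specified here.
SONGS = {
    "Song A": "sun beach summer waves dancing all night",
    "Song B": "rainy city lights lonely winter night",
    "Song C": "party friends beach music summer fun",
}

def score(query_terms, lyrics, collection_counts, collection_len, lam=0.5):
    """Jelinek-Mercer smoothed query likelihood: log P(query | song)."""
    doc = Counter(lyrics.split())
    doc_len = sum(doc.values())
    total = 0.0
    for t in query_terms:
        p_doc = doc[t] / doc_len
        p_col = collection_counts[t] / collection_len
        # Mix document and collection probabilities so unseen terms
        # are penalized rather than zeroing out the whole song.
        p = lam * p_doc + (1 - lam) * p_col
        total += math.log(p) if p > 0 else math.log(1e-12)
    return total

def rank(query_terms):
    """Return song names ordered by descending query likelihood."""
    collection = Counter()
    for lyrics in SONGS.values():
        collection.update(lyrics.split())
    n = sum(collection.values())
    scored = [(score(query_terms, lyrics, collection, n), name)
              for name, lyrics in SONGS.items()]
    return [name for _, name in sorted(scored, reverse=True)]

# Concepts extracted from event pictures, e.g. scene and event type.
print(rank(["beach", "summer", "party"]))
```

Here "Song C" ranks first because it matches all three concept terms, while "Song B" ranks last despite sharing vocabulary with the collection, mirroring the rank-ordered candidate list described in the abstract.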