To identify repeated patterns and contrasting sections in music, it is common to use self-similarity matrices (SSMs) to visualize and estimate structure. We introduce a novel application for SSMs derived from audio recordings: using them to model the potential reasoning behind a listener's structural annotation. We use SSMs generated from musically motivated audio features at various timescales to represent contributions to a structural annotation. Since a listener's attention can shift among musical features (e.g., rhythm, timbre, and harmony) over the course of a piece, we further decompose the SSMs into section-wise components and use quadratic programming (QP) to minimize the distance between a weighted linear sum of these components and the annotated description. We posit that the optimal section-wise weights on the feature components may indicate the features to which a listener attended when annotating a piece, and thus may help to explain why two listeners disagreed about a piece's structure. We discuss examples that substantiate the claim that feature relevance varies throughout a piece, use our method to investigate differences between listeners' interpretations, and lastly propose some variations on our method.
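The core fitting step described above can be sketched as a nonnegative least-squares problem: stack each flattened feature SSM as a column of a matrix and find nonnegative weights whose linear combination best matches the flattened annotation SSM. The sketch below is illustrative, not the paper's implementation: the feature SSMs and annotation matrix are random stand-ins, the fit is global rather than section-wise (a section-wise version would repeat the same solve on each annotated segment), and the `nnls_pg` solver is a hypothetical minimal projected-gradient QP solver written for self-containment.

```python
import numpy as np

def nnls_pg(A, b, iters=2000):
    """Projected gradient descent for min ||A w - b||^2 s.t. w >= 0.

    A toy stand-in for a proper QP solver; step size is set from the
    spectral norm of A to guarantee a stable descent.
    """
    lr = 1.0 / np.linalg.norm(A, 2) ** 2
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ w - b)          # gradient of the squared error
        w = np.maximum(w - lr * grad, 0)  # project onto the feasible set
    return w

rng = np.random.default_rng(0)
n = 40  # number of time frames

# Hypothetical stand-ins for SSMs from musically motivated features
# (e.g., rhythm, timbre, harmony); each is a symmetric n x n matrix.
feature_ssms = [(S + S.T) / 2 for S in (rng.random((n, n)) for _ in range(3))]

# Hypothetical annotation SSM: a block-diagonal description of two sections.
annotation = np.zeros((n, n))
annotation[:20, :20] = 1.0
annotation[20:, 20:] = 1.0

# Fit: columns of A are the flattened feature SSMs, b is the annotation.
A = np.column_stack([S.ravel() for S in feature_ssms])
b = annotation.ravel()
weights = nnls_pg(A, b)

# Normalized weights can be read as per-feature relevance to the annotation.
relevance = weights / weights.sum()
print(np.round(relevance, 3))
```

In practice one would replace the random matrices with SSMs computed from real audio features, and constrain or regularize the weights further (e.g., summing to one per section) depending on the interpretation desired.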