For a single musical work, there often exists a large number of relevant digital documents, including various audio recordings, MIDI files, and digitized sheet music. The general goal of music synchronization is to automatically align these multiple information sources related to a given musical work. In computing such alignments, one typically faces a delicate tradeoff between robustness, accuracy, and efficiency. In this paper, we introduce various refinement strategies for music synchronization. First, we introduce novel audio features that combine the temporal accuracy of onset features with the robustness of chroma features. Then, we show how these features can be used within an efficient and robust multiscale synchronization framework. In addition, we introduce an interpolation method for further increasing the temporal resolution. Finally, we report on our experiments based on polyphonic Western music, demonstrating the respective improvements of the proposed refinement strategies.
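The core alignment step underlying such synchronization frameworks is commonly dynamic time warping (DTW) over sequences of chroma features. The paper's own features and multiscale scheme are more refined, but the basic idea can be sketched as follows: a minimal, pure-Python DTW that aligns two sequences of 12-dimensional chroma vectors under a cosine distance. All function names here (`cosine_dist`, `dtw_path`) are illustrative, not from the paper.

```python
import math

def cosine_dist(u, v):
    # Cosine distance between two chroma vectors; since chroma
    # entries are non-negative, the result lies in [0, 1].
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    if nu == 0.0 or nv == 0.0:
        return 1.0
    return 1.0 - dot / (nu * nv)

def dtw_path(X, Y):
    """Classic DTW: returns (total cost, optimal warping path)
    aligning feature sequences X and Y."""
    n, m = len(X), len(Y)
    INF = float("inf")
    # Accumulated-cost matrix with one extra row/column as boundary.
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = cosine_dist(X[i - 1], Y[j - 1])
            D[i][j] = c + min(D[i - 1][j],      # vertical step
                              D[i][j - 1],      # horizontal step
                              D[i - 1][j - 1])  # diagonal step
    # Backtrack the optimal path from (n, m) to (1, 1).
    path = []
    i, j = n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        _, (i, j) = min((D[i - 1][j - 1], (i - 1, j - 1)),
                        (D[i - 1][j], (i - 1, j)),
                        (D[i][j - 1], (i, j - 1)))
    path.reverse()
    return D[n][m], path
```

For example, aligning a three-chord chroma sequence against a time-stretched version of itself yields cost 0 and a monotone path that maps each frame of the shorter sequence to its stretched counterparts. The quadratic cost of this baseline is exactly what the paper's multiscale framework is designed to avoid.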