Refinement Strategies for Music Synchronization

  • Authors:
  • Sebastian Ewert; Meinard Müller

  • Affiliations:
  • Institut für Informatik III, Universität Bonn, 53117 Bonn, Germany; Max-Planck-Institut für Informatik, 66123 Saarbrücken, Germany

  • Venue:
  • Computer Music Modeling and Retrieval. Genesis of Meaning in Sound and Music
  • Year:
  • 2009

Abstract

For a single musical work, there often exists a large number of relevant digital documents, including various audio recordings, MIDI files, and digitized sheet music. The general goal of music synchronization is to automatically align the multiple information sources related to a given musical work. In computing such alignments, one typically faces a delicate tradeoff between robustness, accuracy, and efficiency. In this paper, we introduce various refinement strategies for music synchronization. First, we present novel audio features that combine the temporal accuracy of onset features with the robustness of chroma features. Then, we show how these features can be used within an efficient and robust multiscale synchronization framework. In addition, we introduce an interpolation method for further increasing the temporal resolution. Finally, we report on experiments with polyphonic Western music that demonstrate the respective improvements of the proposed refinement strategies.
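
A common baseline for music synchronization, which the refinement strategies in this paper build upon, is to align chroma feature sequences of two versions of a piece via dynamic time warping (DTW). The following minimal NumPy sketch illustrates plain DTW alignment on toy chroma-like sequences; it is only an illustrative baseline, not the authors' refined onset-enhanced features or multiscale framework, and the cosine distance and toy data are assumptions for the example.

```python
import numpy as np

def dtw_align(X, Y):
    """Align two feature sequences X (d x N) and Y (d x M) via dynamic
    time warping with a cosine distance; return the optimal warping path."""
    # Pairwise cosine distance matrix between all frame pairs
    Xn = X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-9)
    Yn = Y / (np.linalg.norm(Y, axis=0, keepdims=True) + 1e-9)
    C = 1.0 - Xn.T @ Yn                                   # shape (N, M)

    N, M = C.shape
    D = np.full((N + 1, M + 1), np.inf)                   # accumulated cost
    D[0, 0] = 0.0
    for n in range(1, N + 1):
        for m in range(1, M + 1):
            D[n, m] = C[n - 1, m - 1] + min(D[n - 1, m],      # vertical step
                                            D[n, m - 1],      # horizontal step
                                            D[n - 1, m - 1])  # diagonal step

    # Backtrack the optimal warping path from (N, M) to (1, 1)
    path, (n, m) = [], (N, M)
    while n > 0 and m > 0:
        path.append((n - 1, m - 1))
        step = np.argmin([D[n - 1, m - 1], D[n - 1, m], D[n, m - 1]])
        if step == 0:
            n, m = n - 1, m - 1
        elif step == 1:
            n -= 1
        else:
            m -= 1
    return path[::-1]

# Toy example (hypothetical data): two 12-dimensional chroma-like sequences,
# where the "performance" Y runs at half the tempo of the "reference" X.
rng = np.random.default_rng(0)
X = rng.random((12, 40))
Y = np.repeat(X, 2, axis=1)
print(dtw_align(X, Y)[:5])    # first few frame correspondences
```

In practice, the frame-level warping path produced by such an approach is only as accurate as the feature resolution, which is the tradeoff the paper's onset-enhanced features, multiscale computation, and interpolation step are designed to address.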