Music scene-adaptive harmonic dictionary for unsupervised note-event detection

  • Authors:
  • J. J. Carabias-Orti; P. Vera-Candeas; F. J. Cañadas-Quesada; N. Ruiz-Reyes

  • Affiliations:
  • Telecommunication Engineering Department, University of Jaén, Linares, Jaén, Spain (all authors)

  • Venue:
  • IEEE Transactions on Audio, Speech, and Language Processing
  • Year:
  • 2010

Abstract

Harmonic decompositions are a powerful tool for dealing with polyphonic music signals in potential applications such as music visualization, music transcription, and instrument recognition. The usefulness of a harmonic decomposition relies on the design of a proper harmonic dictionary. Music scene-adaptive harmonic atoms have been used for this purpose. These atoms are adapted to the musical instruments and to the music scene, including aspects related to the venue, the musician, and other relevant acoustic properties. In this paper, an unsupervised process to obtain music scene-adaptive spectral patterns for each MIDI note is proposed. Furthermore, the obtained harmonic dictionary is applied to note-event detection with matching pursuit. For a music database consisting only of one-instrument signals, promising results (high accuracy and a low error rate) have been achieved for note-event detection.
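The abstract's core pipeline (a dictionary of harmonic atoms, one per MIDI note, decomposed greedily with matching pursuit) can be illustrated with a minimal sketch. This is not the paper's scene-adaptive method: the atoms below use a hypothetical fixed geometric partial-decay model rather than patterns learned from the music scene, and all parameters (`n_partials`, `decay`, the FFT sizing) are illustrative assumptions.

```python
import numpy as np

def harmonic_atom(midi_note, n_bins=2048, sr=44100, n_partials=10, decay=0.8):
    """Hypothetical harmonic atom: a unit-norm spectral template with
    geometrically decaying amplitudes at integer multiples of the f0."""
    f0 = 440.0 * 2.0 ** ((midi_note - 69) / 12.0)
    atom = np.zeros(n_bins)
    for h in range(1, n_partials + 1):
        k = int(round(h * f0 * 2 * n_bins / sr))  # bin of h-th partial
        if k < n_bins:
            atom[k] = decay ** (h - 1)
    return atom / np.linalg.norm(atom)

def matching_pursuit(spectrum, dictionary, n_iter=5):
    """Greedy matching pursuit: at each step pick the atom most correlated
    with the residual and subtract its projection."""
    residual = spectrum.astype(float).copy()
    picks = []
    for _ in range(n_iter):
        corr = dictionary @ residual          # atoms are unit-norm rows
        best = int(np.argmax(corr))
        picks.append((best, corr[best]))
        residual -= corr[best] * dictionary[best]
    return picks, residual

# Toy usage: a one-octave dictionary and a synthetic two-note spectrum.
notes = list(range(60, 72))
D = np.stack([harmonic_atom(n) for n in notes])
signal = 1.0 * D[0] + 0.6 * D[4]              # MIDI notes 60 and 64
picks, _ = matching_pursuit(signal, D, n_iter=2)
print([notes[i] for i, _ in picks])           # strongest atoms first
```

Note-event detection then amounts to thresholding the pursuit coefficients per analysis frame; the paper's contribution is replacing the fixed template above with spectral patterns learned unsupervised from the recording itself.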