The Intelligent Music Editor: Towards an Automated Platform for Music Analysis and Editing

  • Authors:
  • Yuxiang Liu; Roger B. Dannenberg; Lianhong Cai

  • Affiliations:
  • Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China; School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, USA; Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China

  • Venue:
  • ICIC'10: Proceedings of Advanced Intelligent Computing Theories and Applications, 6th International Conference on Intelligent Computing
  • Year:
  • 2010

Abstract

Digital music editing is a standard step in music production for correcting mistakes and enhancing quality, but it is tedious and time-consuming. The Intelligent Music Editor (IMED) automates routine music editing tasks using techniques from music transcription (especially audio-to-score alignment) and signal processing. IMED starts with multiple recorded tracks and a detailed score that specifies all of the notes to be played. A transcription algorithm locates notes in the recording and identifies their pitch. A scheduling model tracks the instantaneous tempo of the recorded performance and determines adjusted timings for the output tracks. A time-domain pitch-modification/time-stretching algorithm performs pitch correction and time adjustment. An empirical evaluation on a multi-track recording shows that the proposed algorithms achieve an onset-detection accuracy of 87%, and a detailed subjective evaluation shows that IMED improves pitch and timing accuracy while retaining the expressive nuance of the original recording.
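The abstract does not spell out how the score is matched to the recorded audio; alignment of this kind is commonly implemented with dynamic time warping (DTW) over per-frame features such as chroma vectors. The sketch below is a minimal, hypothetical illustration of that general approach in NumPy; the function name, features, and toy data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def dtw_align(score_feats, audio_feats):
    """Align two feature sequences (frames x dims) with plain DTW.

    Returns the accumulated-cost matrix and the optimal warping path
    as a list of (score_frame, audio_frame) pairs.
    """
    n, m = len(score_feats), len(audio_feats)
    # Pairwise Euclidean frame distances.
    cost = np.linalg.norm(
        score_feats[:, None, :] - audio_feats[None, :, :], axis=-1)
    # Accumulated cost with the standard step set {(1,0), (0,1), (1,1)}.
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    # Backtrack from the end to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return acc[1:, 1:], path[::-1]


# Toy usage: align an idealized 'score' chroma sequence to a slower,
# noisier 'performance' of the same notes.
score = np.eye(12)[np.array([0, 4, 7, 0])]              # 4 note frames
perf = np.repeat(score, 3, axis=0) + 0.05 * np.random.randn(12, 12)
_, path = dtw_align(score, perf)
print(path)  # each score frame maps to roughly 3 performance frames
```

The warping path gives candidate onset positions for each score note; an onset detector and pitch estimator would then refine these positions and report the played pitch, which is the role the paper assigns to its transcription algorithm.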
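For the time-domain pitch-modification/time-stretching step, a common building block is synchronous overlap-add (SOLA), which shifts analysis frames to the best cross-correlation lag before cross-fading them. The following is a rough sketch under that assumption; it is not the editor's actual algorithm, and the parameters and helper name are hypothetical.

```python
import numpy as np

def sola_stretch(x, rate, frame_len=2048, overlap=512, search=256):
    """Naive SOLA time stretch: rate > 1 shortens, rate < 1 lengthens.

    Analysis frames are read every (frame_len - overlap) * rate samples
    and overlap-added at a fixed synthesis hop, searching a small lag
    range so the cross-faded regions stay waveform-aligned.
    """
    syn_hop = frame_len - overlap
    ana_hop = int(round(syn_hop * rate))
    out = np.array(x[:frame_len], dtype=float)
    fade_in = np.linspace(0.0, 1.0, overlap)
    fade_out = 1.0 - fade_in
    pos = ana_hop
    while pos + frame_len + search <= len(x):
        frame = x[pos:pos + frame_len + search].astype(float)
        tail = out[-overlap:]
        # Pick the lag whose overlap region best matches the output tail.
        corrs = [np.dot(tail, frame[k:k + overlap]) for k in range(search)]
        k = int(np.argmax(corrs))
        frame = frame[k:k + frame_len]
        # Cross-fade the overlapping region, then append the remainder.
        out[-overlap:] = fade_out * tail + fade_in * frame[:overlap]
        out = np.concatenate([out, frame[overlap:]])
        pos += ana_hop
    return out


# Toy usage: stretch a 440 Hz sine to roughly 125% of its duration.
sr = 44100
t = np.arange(sr) / sr
sine = np.sin(2 * np.pi * 440 * t)
stretched = sola_stretch(sine, rate=0.8)
print(len(sine), len(stretched))
```

In a time-domain editor of this kind, pitch correction is typically obtained by combining such a stretch with resampling (resample to shift pitch, then stretch to restore duration); the paper's combined pitch-modification/time-stretching algorithm performs both adjustments, though its exact formulation is not given in the abstract.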