Interactively evolving harmonies through functional scaffolding

  • Authors:
  • Amy K. Hoover; Paul A. Szerlip; Kenneth O. Stanley

  • Affiliations:
  • University of Central Florida, Orlando, FL, USA (all authors)

  • Venue:
  • Proceedings of the 13th annual conference on Genetic and evolutionary computation

  • Year:
  • 2011

Abstract

While the real-time focus of today's automated accompaniment generators can benefit instrumentalists and vocalists in their practice, improvisation, or performance, an opportunity remains specifically to assist novice composers. This paper introduces a novel approach of this kind based on evolutionary computation, called functional scaffolding for musical composition (FSMC), which helps the user explore potential accompaniments for existing musical pieces, or scaffolds. The key idea is to produce the accompaniment as a function of the scaffold, so that it inherits the scaffold's style and texture. To implement this idea, accompaniments are represented by a special type of neural network called a compositional pattern producing network (CPPN), which produces harmonies by elaborating on and exploiting regularities in the pitches and rhythms found in the scaffold. This paper focuses on how inexperienced composers can personalize accompaniments by first choosing any MIDI scaffold, then selecting which parts (e.g. the piano, guitar, or bass guitar) the CPPN can hear, and finally customizing and refining the computer-generated accompaniment through an interactive process of selecting and mutating CPPNs, called interactive evolutionary computation (IEC). The potential of this approach is demonstrated by following the evolution of a specific accompaniment and by studying whether listeners appreciate the results.
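
To make the workflow described above concrete, the following Python sketch illustrates the general shape of the idea; it is not the authors' implementation. The paper evolves CPPNs (including their topology) over richer pitch and rhythm information drawn from the MIDI scaffold, with a human listener performing the selection step of IEC. Here, a fixed-topology stand-in network maps scaffold pitch and beat position to a harmony offset, and a toy evolution loop keeps one "favorite" per generation and mutates it. All class names, parameters, and constants are illustrative assumptions.

```python
import math
import random

class TinyCPPN:
    """Fixed two-input, one-hidden-layer network standing in for an evolved CPPN."""

    def __init__(self, hidden=4):
        self.w1 = [[random.gauss(0, 1) for _ in range(2)] for _ in range(hidden)]
        self.w2 = [random.gauss(0, 1) for _ in range(hidden)]

    def activate(self, pitch_norm, beat_norm):
        # Sine activations produce the periodic, patterned outputs CPPNs favor.
        h = [math.sin(w[0] * pitch_norm + w[1] * beat_norm) for w in self.w1]
        return math.tanh(sum(wi * hi for wi, hi in zip(self.w2, h)))

    def mutate(self, sigma=0.3):
        """Return a perturbed copy (Gaussian weight mutation)."""
        child = TinyCPPN(hidden=len(self.w2))
        child.w1 = [[w + random.gauss(0, sigma) for w in row] for row in self.w1]
        child.w2 = [w + random.gauss(0, sigma) for w in self.w2]
        return child


def accompany(net, scaffold):
    """Map each scaffold note (MIDI pitch at a beat index) to a harmony pitch."""
    harmony = []
    for beat, pitch in enumerate(scaffold):
        out = net.activate(pitch / 127.0, (beat % 16) / 16.0)
        harmony.append(pitch + int(round(out * 12)))  # offset within one octave
    return harmony


def interactive_evolution(scaffold, population=6, generations=3):
    """Toy IEC loop: a 'user' picks a favorite, which seeds the next generation."""
    nets = [TinyCPPN() for _ in range(population)]
    for gen in range(generations):
        for i, net in enumerate(nets):
            print(f"gen {gen}, candidate {i}: {accompany(net, scaffold)}")
        favorite = nets[0]  # stand-in for the human listener's choice
        nets = [favorite] + [favorite.mutate() for _ in range(population - 1)]
    return nets[0]


if __name__ == "__main__":
    melody = [60, 62, 64, 65, 67, 65, 64, 62]  # a simple C-major scaffold
    best = interactive_evolution(melody)
    print("final accompaniment:", accompany(best, melody))
```

In the sketch, selection is hard-coded to the first candidate; in FSMC that choice is made by the user auditioning each candidate accompaniment against the scaffold, which is what makes the process interactive rather than fitness-driven.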