Learning to dance through interactive evolution

  • Authors:
  • Greg A. Dubbin; Kenneth O. Stanley

  • Affiliations:
  • School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, FL (both authors)

  • Venue:
  • EvoCOMNET'10: Proceedings of the 2010 International Conference on Applications of Evolutionary Computation - Volume Part II
  • Year:
  • 2010


Abstract

A relatively rare application of artificial intelligence at the nexus of art and music is dance. The impulse shared by all humans to express themselves through dance represents a unique opportunity to artificially capture human creative expression. In particular, the spontaneity and relative ease of moving to music without any overall plan suggest a natural connection between temporal patterns and motor control. To explore this potential, this paper presents Dance Evolution, a technique that allows the user to train virtual humans to dance to MIDI sequences or raw audio; that is, the dancers can dance to any song heard on the radio, including the latest pop music. The dancers are controlled by artificial neural networks (ANNs) that “hear” MIDI sequences or raw audio processed through a discrete Fourier transform-based technique. The ANNs learn to dance in new ways through an interactive evolutionary process driven by the user. The main result is that when motion is expressed as a function of sound, the effect is a plausible approximation of the natural human tendency to move to music.
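The abstract's pipeline (DFT-based audio features, an ANN mapping sound to motion, and user-driven selection) can be summarized in code. The sketch below is illustrative, not the authors' implementation: the window size, band count, network topology, and Gaussian-mutation scheme are all assumptions standing in for the paper's actual design.

    # A minimal sketch of the Dance Evolution pipeline described in the abstract.
    # All parameters (window size, band count, network shape, mutation sigma)
    # are illustrative assumptions, not the paper's actual values.
    import numpy as np

    N_BANDS = 8     # frequency bands fed to the network (assumed)
    N_JOINTS = 12   # dancer joint angles the network drives (assumed)
    WINDOW = 1024   # audio samples per analysis window (assumed)

    def audio_features(window: np.ndarray, n_bands: int = N_BANDS) -> np.ndarray:
        """Reduce one window of raw audio to band magnitudes via the DFT."""
        spectrum = np.abs(np.fft.rfft(window))          # magnitude spectrum
        bands = np.array_split(spectrum, n_bands)       # group bins into bands
        return np.array([b.mean() for b in bands])      # one feature per band

    class DancerANN:
        """Tiny feedforward network mapping audio features to joint angles."""
        def __init__(self, rng: np.random.Generator, hidden: int = 16):
            self.w1 = rng.normal(0.0, 1.0, (hidden, N_BANDS))
            self.w2 = rng.normal(0.0, 1.0, (N_JOINTS, hidden))

        def step(self, features: np.ndarray) -> np.ndarray:
            h = np.tanh(self.w1 @ features)
            return np.tanh(self.w2 @ h)   # joint angles scaled to [-1, 1]

        def mutated(self, rng: np.random.Generator, sigma: float = 0.1) -> "DancerANN":
            """Gaussian weight perturbation: one simple stand-in for the
            evolutionary variation that user-driven selection acts on."""
            child = DancerANN.__new__(DancerANN)
            child.w1 = self.w1 + rng.normal(0.0, sigma, self.w1.shape)
            child.w2 = self.w2 + rng.normal(0.0, sigma, self.w2.shape)
            return child

    # One generation of interactive evolution: fitness is the user's choice.
    rng = np.random.default_rng(0)
    population = [DancerANN(rng) for _ in range(8)]
    audio = rng.normal(0.0, 1.0, WINDOW)   # stand-in for one window of audio
    poses = [net.step(audio_features(audio)) for net in population]
    favorite = population[0]               # in the real system, the user picks by eye
    population = [favorite.mutated(rng) for _ in range(8)]

The key structural point this sketch captures is that motion is a pure function of sound: each audio window is reduced to a feature vector, and the network's output at that instant is the dancer's pose, so the same evolved network produces different dances for different songs.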