Making a Robot Dance to Music Using Chaotic Itinerancy in a Network of FitzHugh-Nagumo Neurons

  • Authors:
  • Jean-Julien Aucouturier; Yuta Ogai; Takashi Ikegami

  • Affiliations:
  • Department of General Systems Studies, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan 153-8902 (all authors)

  • Venue:
  • Neural Information Processing
  • Year:
  • 2008


Abstract

We propose a technique to make a robot execute free, solitary dance movements to music, in a manner that simulates the dynamic alternation between synchronisation and autonomy typically observed in human behaviour. In contrast with previous approaches, we preprogram neither the dance patterns nor their alternation, but rather build basic dynamics into the robot and let the behaviour emerge in a seemingly autonomous manner. The robot's motor commands are generated in real time by converting the output of a neural network that processes a sequence of pulses corresponding to the beats of the music being danced to. The spiking behaviour of individual neurons is governed by a biologically inspired model (FitzHugh-Nagumo). Under appropriate parameters, the network generates chaotic itinerant behaviour among low-dimensional local attractors. A robot controlled this way exhibits a variety of motion styles, some periodic and strongly coupled to the musical rhythm and others more independent, as well as spontaneous jumps from one style of motion to the next. The resulting behaviour is completely deterministic (as the solution of a non-linear dynamical system), adaptive to the music being played, and, we believe, an interesting compromise between synchronisation and autonomy.
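
The abstract describes a concrete pipeline: beat times become a pulse train of input current, the pulses drive a network of FitzHugh-Nagumo neurons, and the neurons' membrane potentials are read out as motor commands. The following is a minimal sketch of that pipeline, not the authors' implementation: the square pulse shape, the random coupling matrix, the Euler integration, all parameter values, and the tanh readout to joint angles are illustrative assumptions.

```python
# Sketch: beat-driven network of coupled FitzHugh-Nagumo neurons.
# Assumed, not from the paper: pulse shape, coupling matrix, readout.
import numpy as np

def simulate_fhn_network(beat_times, n_neurons=6, dt=0.01, duration=10.0,
                         a=0.7, b=0.8, eps=0.08, pulse_amp=0.5,
                         pulse_width=0.05, coupling=0.1, seed=0):
    """Euler-integrate a coupled FitzHugh-Nagumo network driven by
    current pulses at the given beat times; returns (t, v) traces."""
    rng = np.random.default_rng(seed)
    n_steps = int(duration / dt)
    t = np.arange(n_steps) * dt

    # Pulse train: a brief square pulse of input current at each beat.
    I = np.zeros(n_steps)
    for tb in beat_times:
        i0 = int(tb / dt)
        I[i0:min(i0 + int(pulse_width / dt), n_steps)] = pulse_amp

    # Random coupling matrix (an assumption): the paper tunes this
    # regime so the network shows chaotic itinerancy among attractors.
    W = coupling * rng.standard_normal((n_neurons, n_neurons))
    np.fill_diagonal(W, 0.0)

    v = rng.uniform(-1, 1, n_neurons)   # membrane potentials
    w = rng.uniform(-1, 1, n_neurons)   # recovery variables
    vs = np.empty((n_steps, n_neurons))
    for k in range(n_steps):
        # FitzHugh-Nagumo dynamics with beat input and coupling:
        #   dv/dt = v - v^3/3 - w + I_beat + sum_j W_ij v_j
        #   dw/dt = eps * (v + a - b*w)
        dv = v - v**3 / 3 - w + I[k] + W @ v
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        vs[k] = v
    return t, vs

# Hypothetical readout: map each neuron's potential to a joint angle.
beats = np.arange(0.5, 10.0, 0.5)       # e.g. beats at 120 BPM
t, v = simulate_fhn_network(beats)
joint_angles = np.tanh(v) * 45.0        # degrees; illustrative scaling
```

With suitably tuned coupling, the potentials would hop between quasi-periodic patterns (local attractors), some phase-locked to the beat pulses and others not; the unturned random matrix above is a placeholder and would need adjustment to reproduce the itinerant regime the paper reports.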