Active Vision for Goal-Oriented Humanoid Robot Walking

  • Authors:
  • Mototaka Suzuki; Tommaso Gritti; Dario Floreano

  • Affiliations:
  • Mahoney-Keck Center for Brain & Behavior Research, Columbia University Medical Center, New York, NY 10032, USA; Video Processing & Analysis, Philips Research, 5656 AB Eindhoven, The Netherlands; Laboratory of Intelligent Systems, Ecole Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne, Switzerland

  • Venue:
  • Creating Brain-Like Intelligence
  • Year:
  • 2009

Abstract

Complex visual tasks may be tackled by remarkably simple neural architectures generated through a co-evolutionary process of active vision and feature selection. This hypothesis has recently been tested in several robotic applications, such as shape discrimination, car driving, and indoor/outdoor navigation of a wheeled robot. Here we describe an experiment in which the hypothesis is examined further in a goal-oriented humanoid bipedal walking task. A HOAP-2 humanoid robot equipped with a primitive vision system on its head is evolved while freely interacting with its environment. Unlike wheeled robots, bipedal walking robots are exposed to strongly perturbed visual input caused by their own walking dynamics. We show that the evolved robots are capable of coping with these dynamics and of accomplishing the task by means of active, efficient camera control.
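
To make the general idea of co-evolved active vision concrete, the sketch below shows a minimal neuroevolution loop in which a small neural controller reads a coarse "retina" sampled at the current gaze position and outputs gaze displacements. The toy 2D scene, retina size, network architecture, mutation parameters, and distance-based fitness are illustrative assumptions for exposition only; they are not the authors' actual HOAP-2 setup, simulator, or evolutionary algorithm.

```python
# Hypothetical sketch of a co-evolved active-vision controller (not the paper's code).
# A single-layer network maps a coarse retinal patch to gaze displacements, and a
# plain generational evolutionary algorithm evolves its weights on a toy scene.
import numpy as np

RETINA = 5          # retina is a RETINA x RETINA grid of coarse pixels (assumption)
IMG = 32            # side length of the toy visual field (assumption)
N_IN = RETINA * RETINA
N_OUT = 2           # gaze displacement (dx, dy)

def sample_retina(image, gaze):
    """Crop a RETINA x RETINA patch centred on the gaze position (active vision)."""
    x, y = np.clip(gaze, RETINA // 2, IMG - RETINA // 2 - 1).astype(int)
    return image[y - RETINA // 2:y + RETINA // 2 + 1,
                 x - RETINA // 2:x + RETINA // 2 + 1].ravel()

def make_scene(rng):
    """Toy visual field: a single bright target at a random location."""
    image = np.zeros((IMG, IMG))
    target = rng.integers(4, IMG - 4, size=2)
    image[target[1], target[0]] = 1.0
    return image, target

def evaluate(weights, rng, steps=20, trials=5):
    """Fitness: how close the gaze ends up to the target, averaged over trials."""
    W = weights.reshape(N_OUT, N_IN + 1)              # single-layer controller + bias
    score = 0.0
    for _ in range(trials):
        image, target = make_scene(rng)
        gaze = np.array([IMG // 2, IMG // 2], dtype=float)
        for _ in range(steps):
            retina = sample_retina(image, gaze)
            out = np.tanh(W @ np.append(retina, 1.0))  # displacement in [-1, 1]
            gaze = np.clip(gaze + 3.0 * out, 0, IMG - 1)
        score -= np.linalg.norm(gaze - target)         # closer gaze -> higher fitness
    return score / trials

def evolve(generations=50, pop_size=30, sigma=0.3, seed=0):
    """Generational evolution of the controller weights with simple elitism."""
    rng = np.random.default_rng(seed)
    pop = rng.normal(0.0, 0.5, size=(pop_size, N_OUT * (N_IN + 1)))
    for gen in range(generations):
        fitness = np.array([evaluate(ind, rng) for ind in pop])
        elite = pop[np.argsort(fitness)[-pop_size // 5:]]        # keep the top 20 %
        children = elite[rng.integers(len(elite), size=pop_size)]
        pop = children + rng.normal(0.0, sigma, size=children.shape)
        pop[:len(elite)] = elite                                 # elitism: copy parents unchanged
        print(f"gen {gen:3d}  best fitness {fitness.max():7.2f}")
    return pop[np.argmax([evaluate(ind, rng) for ind in pop])]

if __name__ == "__main__":
    best_controller = evolve()
```

In the actual experiment the fitness would instead be measured on the walking robot (or its physics simulation) while the controller also drives locomotion, and the perturbations induced by the gait would enter through the sampled images; the sketch only illustrates the closed loop between gaze control, coarse feature sampling, and evolutionary search.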