In this paper, we describe the artificial evolution of adaptive neural controllers for an outdoor mobile robot equipped with a mobile camera. The robot can dynamically select its gazing direction by moving its body and/or the camera. The neural control system, which maps visual information to motor commands, is evolved online by a genetic algorithm, while the synaptic connections (receptive fields) from visual photoreceptors to internal neurons can also be modified by Hebbian plasticity as the robot moves through the environment. We show that robots evolved in physics-based simulations with Hebbian visual plasticity display more robust adaptive behavior when transferred to real outdoor environments than robots evolved without visual plasticity. We also show that active vision significantly and consistently affects the formation of visual receptive fields, as compared to receptive fields formed from grid-sampled images of the robot's environment. Finally, we show that the interplay between active vision and receptive field formation amounts to the selection and exploitation of a small and constant subset of the visual features available to the robot.
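The lifetime-learning component described above — Hebbian plasticity reshaping the photoreceptor-to-internal-neuron receptive fields while the robot behaves — can be sketched minimally. Everything below is an illustrative assumption rather than the paper's actual implementation: the dimensions, the learning rate, the tanh activation, and the specific choice of a plain Hebbian update with row normalization (a common way to keep Hebbian weights bounded) are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_update(W, pre, post, eta=0.01):
    """Plain Hebbian update of receptive-field weights W (post x pre):
    strengthen each connection in proportion to pre * post activity,
    then normalize each row so weights stay bounded."""
    W = W + eta * np.outer(post, pre)
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W / np.maximum(norms, 1e-8)

# Hypothetical sizes: 25 photoreceptors feeding 5 internal neurons.
n_pre, n_post = 25, 5
W = rng.normal(scale=0.1, size=(n_post, n_pre))

# One robot "lifetime": in the paper this loop would be driven by the
# camera images gathered as the evolved controller moves the robot;
# here random pixel vectors stand in for that visual input.
for step in range(100):
    pixels = rng.random(n_pre)        # stand-in for a camera frame
    activation = np.tanh(W @ pixels)  # internal neuron activity
    W = hebbian_update(W, pixels, activation)
```

In the full algorithm this plastic inner loop would run inside each fitness evaluation of the genetic algorithm, so that selection acts on controllers whose receptive fields have already been shaped by the robot's own gaze behavior.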