From Vision Sensor to Actuators, Spike Based Robot Control through Address-Event-Representation

  • Authors:
  • A. Jimenez-Fernandez;C. Lujan-Martinez;R. Paz-Vicente;A. Linares-Barranco;G. Jimenez;A. Civit

  • Affiliations:
  • Departamento de Arquitectura y Tecnología de Computadores, Universidad de Sevilla, Sevilla, Spain 41012 (all authors)

  • Venue:
  • IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
  • Year:
  • 2009


Abstract

One field of neuroscience is neuroinformatics, whose aim is to develop auto-reconfigurable systems that mimic the human body and brain. In this paper we present a neuro-inspired, spike-based mobile robot: from inexpensive commercial vision sensors whose output is converted into spike information, through spike filtering for object recognition, to spike-based motor control models. A two-wheel mobile robot driven by DC motors is autonomously controlled to follow a line drawn on the floor. This spiking system has been built around the well-known Address-Event-Representation (AER) mechanism, which communicates the different neuro-inspired layers of the system. The RTC lab has developed all the components presented in this work, from the vision sensor to the robot platform and the FPGA-based platforms for AER processing.