Self-calibrating marker tracking in 3D with event-based vision sensors

  • Authors:
  • Georg R. Müller, Jörg Conradt

  • Affiliations:
  • Technische Universität München, München, Germany (both authors)

  • Venue:
  • ICANN'12: Proceedings of the 22nd International Conference on Artificial Neural Networks and Machine Learning - Volume Part I
  • Year:
  • 2012

Abstract

Tracking an object's position relative to oneself is a fundamental capability for intelligent robotic systems that interact with the real world. This paper presents a computationally efficient vision-based 3D tracking system that can ultimately operate in real time on autonomous mobile robots in cluttered environments. At the core of the system, two neurally inspired event-based dynamic vision sensors (eDVS) independently track a high-frequency flickering LED in their respective 2D angular coordinate frames. A self-adjusting feed-forward neural network maps these independent 2D angular coordinates to a Cartesian 3D position in world coordinates. During an initial calibration phase, an object composed of multiple independent markers with known geometry provides the relative positions between those markers as the training signal; absolute world coordinates are never used for training. In a subsequent application phase, tracking a single marker yields position estimates relative to the sensor origin, while tracking multiple markers additionally provides orientation. The neurally inspired vision-based tracking system runs in real time on ARM7 microcontrollers, without the need for an external PC.
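The abstract does not give implementation details, so the following is only a minimal sketch of the self-calibration idea it describes: a small feed-forward network maps two pairs of sensor angles to a 3D position and is trained solely from the known distance between two markers. The network sizes, learning rate, training loop, the `angles` sensor model, and all function names are assumptions for illustration, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network sizes: 4 inputs (azimuth/elevation from each of the
# two eDVS sensors), one hidden layer, 3 Cartesian outputs. The paper does
# not specify the architecture; these values are placeholders.
N_IN, N_HID, N_OUT = 4, 20, 3
W1 = rng.normal(0.0, 0.5, (N_HID, N_IN)); b1 = np.zeros(N_HID)
W2 = rng.normal(0.0, 0.5, (N_OUT, N_HID)); b2 = np.zeros(N_OUT)

def forward(u):
    """Map stacked angular coordinates u = (az1, el1, az2, el2) to a 3D point."""
    h = np.tanh(W1 @ u + b1)
    return W2 @ h + b2, h

def train_step(u_a, u_b, dist_ab, lr=1e-3):
    """One gradient step on the relative-distance loss
    L = (||f(u_a) - f(u_b)|| - dist_ab)^2 -- only the known separation of
    the two markers is used, never their absolute world coordinates."""
    global W1, b1, W2, b2
    p_a, h_a = forward(u_a)
    p_b, h_b = forward(u_b)
    diff = p_a - p_b
    n = np.linalg.norm(diff) + 1e-9
    g = 2.0 * (n - dist_ab) * diff / n          # dL/dp_a; dL/dp_b = -g
    dW1 = np.zeros_like(W1); db1 = np.zeros_like(b1)
    dW2 = np.zeros_like(W2); db2 = np.zeros_like(b2)
    for u, h, sign in ((u_a, h_a, 1.0), (u_b, h_b, -1.0)):
        go = sign * g                            # gradient at the output layer
        gh = (W2.T @ go) * (1.0 - h**2)          # backprop through tanh hidden layer
        dW2 += np.outer(go, h); db2 += go
        dW1 += np.outer(gh, u); db1 += gh
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    return (n - dist_ab) ** 2

# Toy calibration loop under an assumed geometry: two sensors on the x-axis,
# with azimuth/elevation computed analytically from ground-truth positions
# (standing in for the angles the eDVS LED trackers would report).
SENSORS = np.array([[-0.1, 0.0, 0.0], [0.1, 0.0, 0.0]])

def angles(p):
    out = []
    for s in SENSORS:
        d = p - s
        out += [np.arctan2(d[0], d[2]), np.arctan2(d[1], np.hypot(d[0], d[2]))]
    return np.array(out)

DIST = 0.2  # assumed known separation of the two calibration markers (metres)
for step in range(20000):
    a = rng.uniform([-0.5, -0.5, 0.5], [0.5, 0.5, 1.5])  # marker A in view
    v = rng.normal(size=3)
    b = a + DIST * v / np.linalg.norm(v)                  # marker B at known offset
    loss = train_step(angles(a), angles(b), DIST)
```

One caveat worth noting: a purely distance-based objective determines the learned positions only up to a global rigid transform (and reflection), so a real system must additionally anchor the coordinate frame, for example relative to one sensor's origin as the abstract suggests.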