The Handbook of Brain Theory and Neural Networks
Following an object's position relative to oneself is a fundamental capability required of intelligent robotic systems interacting with the real world. This paper presents a computationally efficient, vision-based 3D tracking system that can ultimately operate in real time on autonomous mobile robots in cluttered environments. At the core of the system, two neurally inspired event-based dynamic vision sensors (eDVS) independently track a high-frequency flickering LED in their respective 2D angular coordinate frames. A self-adjusting feed-forward neural network maps these independent 2D angular coordinates to a Cartesian 3D position in world coordinates. During an initial calibration phase, an object composed of multiple independent markers with known geometry provides relative position information between those markers for network training (absolute world coordinates are never used for training). In the subsequent application phase, tracking a single marker yields position estimates relative to the sensor origin, while tracking multiple markers additionally provides orientation. The neurally inspired vision-based tracking system runs in real time on ARM7 microcontrollers, without the need for an external PC.
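The core idea, a feed-forward network mapping the two sensors' 2D angular coordinates to a 3D position, trained only on known inter-marker distances, can be sketched as follows. The network size, activation, and all variable names here are illustrative assumptions; the paper does not specify its architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 4 angular inputs (u1, v1, u2, v2) -> 16 hidden
# units -> Cartesian (x, y, z). The actual network in the paper may differ.
W1 = rng.normal(scale=0.1, size=(16, 4))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(3, 16))
b2 = np.zeros(3)

def predict(angles):
    """Map [u1, v1, u2, v2] (2D angles from the two eDVS) to a 3D position."""
    h = np.tanh(W1 @ angles + b1)
    return W2 @ h + b2

def calibration_loss(angles_a, angles_b, known_distance):
    """Penalize deviation of the predicted inter-marker distance from the
    known marker geometry -- no absolute world coordinates are required,
    matching the calibration idea described in the abstract."""
    d = np.linalg.norm(predict(angles_a) - predict(angles_b))
    return (d - known_distance) ** 2

# Example: two markers observed at slightly different angles, 0.1 m apart.
loss = calibration_loss(np.array([0.10, 0.20, -0.10, 0.20]),
                        np.array([0.12, 0.20, -0.08, 0.20]),
                        known_distance=0.1)
```

Training would then adjust the weights to drive this loss toward zero over many marker observations; once calibrated, `predict` alone yields position estimates relative to the sensor origin.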