Simultaneous localization and mapping for event-based vision systems

  • Authors:
  • David Weikersdorfer, Raoul Hoffmann, Jörg Conradt

  • Affiliations:
  • Neuroscientific System Theory, Technische Universität München, Germany (all authors)

  • Venue:
  • ICVS '13: Proceedings of the 9th International Conference on Computer Vision Systems
  • Year:
  • 2013


Abstract

We propose a novel method for vision-based simultaneous localization and mapping (vSLAM) using a biologically inspired vision sensor that mimics the human retina. The sensor consists of a 128x128 array of asynchronously operating pixels, which independently emit events upon a temporal illumination change. Such a representation generates small amounts of data with high temporal precision; however, most classic computer vision algorithms need to be reworked, as they require full RGB(-D) images at fixed frame rates. Our vSLAM algorithm operates on individual pixel events and generates high-quality 2D environmental maps with precise robot localization. We evaluate our method against a state-of-the-art marker-based external tracking system and demonstrate real-time performance on standard computing hardware.
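To make the event-based processing model concrete, below is a minimal Python sketch of the per-pixel event representation and an event-by-event map update. The `Event` fields, the `EventSLAM` class, and the toy pixel-to-map projection are illustrative assumptions, not the authors' published algorithm; in particular, a full system would also update the pose estimate (localization) from the same event stream.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Event:
    """One DVS pixel event, as produced by a 128x128 event-based sensor."""
    t: float       # timestamp in seconds (real hardware gives microsecond precision)
    x: int         # pixel column, 0..127
    y: int         # pixel row, 0..127
    polarity: int  # +1 for an illumination increase, -1 for a decrease

class EventSLAM:
    """Hypothetical event-driven 2D mapper: each event is processed
    individually instead of waiting for a full frame."""

    def __init__(self, map_size: int = 512):
        self.map = np.zeros((map_size, map_size))                 # 2D environment map
        self.pose = np.array([map_size / 2, map_size / 2, 0.0])   # x, y, heading

    def project(self, ev: Event) -> tuple:
        # Placeholder projection from sensor pixel to map cell; a real
        # system would apply a calibrated sensor model at the current pose.
        c, s = np.cos(self.pose[2]), np.sin(self.pose[2])
        dx, dy = ev.x - 64, ev.y - 64
        u = int(self.pose[0] + c * dx - s * dy)
        v = int(self.pose[1] + s * dx + c * dy)
        return u, v

    def process(self, ev: Event) -> None:
        # Accumulate event evidence into the map. A complete SLAM system
        # would simultaneously score candidate poses against this map.
        u, v = self.project(ev)
        if 0 <= u < self.map.shape[1] and 0 <= v < self.map.shape[0]:
            self.map[v, u] += 1.0
```

In a full pipeline, this map update would be interleaved with a per-event pose correction; handling both from the same asynchronous stream is what makes the approach simultaneous localization and mapping rather than mapping alone.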