Distributed vision-aided cooperative localization and navigation based on three-view geometry

  • Authors:
  • Vadim Indelman; Pini Gurfil; Ehud Rivlin; Hector Rotstein

  • Affiliations:
  • Faculty of Aerospace Engineering, Technion - Israel Institute of Technology, Haifa 32000, Israel; Faculty of Aerospace Engineering, Technion - Israel Institute of Technology, Haifa 32000, Israel; Department of Computer Science, Technion - Israel Institute of Technology, Haifa 32000, Israel; RAFAEL - Advanced Defense Systems Limited, Haifa 31021, Israel

  • Venue:
  • AERO '11: Proceedings of the 2011 IEEE Aerospace Conference
  • Year:
  • 2011

Abstract

This paper presents a new method for distributed vision-aided cooperative localization and navigation of multiple autonomous platforms, based on constraints stemming from the three-view geometry of a general scene. Each platform is assumed to be equipped with a standard inertial navigation system and an on-board, possibly gimbaled, camera, and the platforms are assumed to be capable of intercommunicating. No other sensors or a priori information are required. In contrast to the traditional approach to cooperative localization, which relies on relative pose measurements, the proposed method formulates a measurement whenever the same scene is observed by different platforms. Each such measurement is based on three images, which are not necessarily captured at the same time. The captured images, tagged with the relevant navigation parameters, are stored in repositories by each, or some, of the platforms in the group. A graph-based approach is applied to calculate the correlation terms between the navigation parameters associated with the images participating in the same measurement. The proposed method is evaluated in a statistical simulation of a leader-follower scenario and demonstrated in an experiment involving two vehicles in a holding-pattern scenario.
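As a rough illustration of the kind of constraint the abstract refers to, the sketch below evaluates three-view residuals for a single matched feature observed from three camera positions, assuming all line-of-sight vectors have already been rotated into a common reference frame. The function and variable names (three_view_residuals, q1, q2, q3, t_12, t_23) are illustrative assumptions, and the exact constraint formulation used by the authors may differ from this simplified version.

```python
import numpy as np

def three_view_residuals(q1, q2, q3, t_12, t_23):
    """Residuals of three-view geometry constraints for one matched feature.

    q1, q2, q3 : line-of-sight vectors to the same scene point from views
                 1, 2 and 3, expressed in a common reference frame.
    t_12, t_23 : translation from view 1 to view 2 and from view 2 to
                 view 3, expressed in the same frame.
    All three residuals are zero for noise-free geometry.
    """
    # Two epipolar (coplanarity) constraints between consecutive view pairs.
    r1 = q1 @ np.cross(t_12, q2)
    r2 = q2 @ np.cross(t_23, q3)
    # Scale-consistency constraint coupling the two translations: the depth
    # of the point recovered from views (1, 2) must agree with the depth
    # recovered from views (2, 3).
    r3 = np.cross(t_12, q1) @ np.cross(q2, q3) - np.cross(t_23, q3) @ np.cross(q1, q2)
    return np.array([r1, r2, r3])

# Synthetic check: three camera positions observing one scene point.
p1, p2, p3 = np.array([0., 0., 0.]), np.array([1., 0., 0.]), np.array([2., 1., 0.])
point = np.array([0.5, 2.0, 3.0])
res = three_view_residuals(point - p1, point - p2, point - p3, p2 - p1, p3 - p2)
print(res)  # ~[0, 0, 0] for perfect measurements
```

In the cooperative setting described in the abstract, the three images need not come from the same platform, so such residuals are driven to zero only when the navigation solutions of the platforms that captured them are mutually consistent, which is what makes them usable as measurements.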