Rotation estimation and vanishing point extraction by omnidirectional vision in urban environment

  • Authors:
  • Jean-Charles Bazin; Cédric Demonceaux; Pascal Vasseur; Inso Kweon

  • Affiliations:
  • Ikeuchi Laboratory, Institute of Industrial Science, The University of Tokyo, Tokyo, Japan; Le2i UMR 5158, University of Burgundy, France; LITIS EA 4108, University of Rouen, France; RCV Lab, KAIST, Korea

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2012

Abstract

Rotation estimation is a fundamental step in various robotic applications such as the automatic control of ground/aerial vehicles, motion estimation and 3D reconstruction. However, it is now well established that traditional navigation equipment, such as global positioning systems (GPSs) or inertial measurement units (IMUs), suffers from several disadvantages. Hence, several vision-based methods have been proposed recently. While they can obtain interesting results, the existing methods have non-negligible limitations, such as difficult feature matching (e.g. under repeated textures, blur or illumination changes) and high computational cost (e.g. analysis in the frequency domain). Moreover, most of them use conventional perspective cameras and thus have a limited field of view. In order to overcome these limitations, in this paper we present a novel rotation estimation approach based on the extraction of vanishing points in omnidirectional images. The first advantage is that our rotation estimation is decoupled from the translation computation, which reduces the execution time and leads to a better control solution. This is made possible by our complete framework dedicated to omnidirectional vision, whereas conventional vision suffers from a rotation/translation ambiguity. Second, we propose a top-down approach that maintains the important constraint of vanishing point orthogonality by inverting the problem: instead of performing a difficult preliminary line-clustering step, we directly search for the orthogonal vanishing points. Finally, experimental results on various data sets for diverse robotic applications have demonstrated that our framework is accurate, robust, maintains the orthogonality of the vanishing points and can run in real time.
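
As a quick illustration of the rotation/translation decoupling described above, once the three orthogonal vanishing directions have been extracted and matched in two frames, the relative rotation follows directly from aligning the two direction triads, with no translation involved. The sketch below is not the authors' implementation; the NumPy-based Kabsch/Procrustes alignment and all names are illustrative assumptions.

```python
import numpy as np

def rotation_from_vanishing_directions(dirs_ref, dirs_cur):
    """Estimate R such that dirs_cur ~= R @ dirs_ref.

    dirs_ref, dirs_cur: 3x3 arrays whose columns are matched unit
    vanishing directions (one per orthogonal scene direction).
    """
    H = dirs_cur @ dirs_ref.T                 # cross-covariance of the two triads
    U, _, Vt = np.linalg.svd(H)               # orthogonal Procrustes via SVD
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # reject reflections
    return U @ D @ Vt                         # closest proper rotation

if __name__ == "__main__":
    ref = np.eye(3)                           # vanishing directions in frame 1
    a = np.deg2rad(10.0)
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    cur = R_true @ ref                        # same directions seen in frame 2
    print(np.allclose(rotation_from_vanishing_directions(ref, cur), R_true))  # True
```

In practice the sign ambiguity of each vanishing direction and the matching of directions across frames must also be resolved; the Procrustes step above assumes both are already handled.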
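The top-down search itself can be pictured as hypothesizing an orthogonal triad (equivalently, a rotation) and scoring it by how many image lines it explains: in a central omnidirectional camera, a line projects onto a great circle of the unit sphere, and the normal of that great circle is perpendicular to the vanishing direction of the line's pencil. The sketch below uses a naive randomized search purely for illustration; it is not the paper's search strategy, and all function and parameter names are assumptions.

```python
import numpy as np

def vp_consensus(R_candidate, line_normals, tol_deg=2.0):
    """Count lines consistent with a candidate orthogonal VP triad.

    R_candidate: rotation whose columns are the three candidate vanishing
    directions. line_normals: Nx3 unit normals of the great circles onto
    which the image line segments project on the unit sphere.
    """
    tol = np.sin(np.deg2rad(tol_deg))
    dots = np.abs(line_normals @ R_candidate)   # small when line i fits direction k
    return int(np.sum(dots.min(axis=1) < tol))

def search_orthogonal_vps(line_normals, n_samples=2000, seed=0):
    """Pick the best-scoring triad among random rotation hypotheses."""
    rng = np.random.default_rng(seed)
    best_R, best_score = np.eye(3), -1
    for _ in range(n_samples):
        Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
        Q *= np.sign(np.linalg.det(Q))                    # force det(Q) = +1
        score = vp_consensus(Q, line_normals)
        if score > best_score:
            best_R, best_score = Q, score
    return best_R, best_score
```

Because every candidate is a rotation by construction, each hypothesis already satisfies the orthogonality constraint, which is the essence of inverting the problem rather than clustering lines first.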