Angle-based homing from a reference image set using the 1D trifocal tensor

  • Authors:
  • M. Aranda, G. López-Nicolás, C. Sagüés

  • Affiliation:
  • Instituto de Investigación en Ingeniería de Aragón, Universidad de Zaragoza, Zaragoza, Spain (all authors)

  • Venue:
  • Autonomous Robots
  • Year:
  • 2013

Abstract

This paper presents a visual homing method for a robot moving on the ground plane. The approach employs a set of omnidirectional images acquired previously at different locations in the environment (including the goal position), together with the current image taken by the robot. As a first contribution, we present a method to obtain the relative angles between all these locations, based on the computation of the 1D trifocal tensor between views and an indirect angle-estimation procedure. The 1D trifocal tensor is particularly well suited to planar motion and lends important robustness to our technique. As a second contribution, we propose a new control law that uses only these angles, with no range information, to drive the robot to the goal. Our method thus exploits the strengths of omnidirectional vision, which provides a wide field of view and very precise angular information. We give a formal proof of the stability of the proposed control law, and we illustrate the performance of our approach through simulations and several sets of experiments with real images.
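The central geometric step, computing the 1D trifocal tensor from bearing-only correspondences across three views, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: each point matched in the three 1D (planar, omnidirectional) views contributes one linear constraint on the eight tensor elements, so the tensor can be recovered up to scale from at least seven correspondences via a null-space computation. The (sin, cos) bearing parameterisation and the function names are assumptions made for the example; the paper may additionally exploit the structure of the tensor under calibrated planar motion.

```python
import numpy as np

def bearing_to_1d(theta):
    """Map a bearing angle (rad) from an omnidirectional view to a
    homogeneous 1D image point. The (sin, cos) convention is an
    assumption; other parameterisations are possible."""
    return np.array([np.sin(theta), np.cos(theta)])

def estimate_1d_trifocal_tensor(thetas_a, thetas_b, thetas_c):
    """Linear least-squares estimate of the 2x2x2 trifocal tensor T from
    bearing-only correspondences across three views.

    Each correspondence (u, v, w) yields one homogeneous equation
        sum_{i,j,k} T[i,j,k] * u[i] * v[j] * w[k] = 0,
    so at least 7 correspondences fix T up to scale.
    """
    assert len(thetas_a) == len(thetas_b) == len(thetas_c) >= 7
    rows = []
    for ta, tb, tc in zip(thetas_a, thetas_b, thetas_c):
        u, v, w = bearing_to_1d(ta), bearing_to_1d(tb), bearing_to_1d(tc)
        # Expand u_i * v_j * w_k into one row of 8 coefficients (C order).
        rows.append(np.einsum('i,j,k->ijk', u, v, w).ravel())
    A = np.asarray(rows)
    # The right singular vector of the smallest singular value spans the
    # (approximate) null space of A and gives T up to scale.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(2, 2, 2)
```

In a homing pipeline of the kind the abstract describes, the three views would typically be the current image, the goal image, and one of the reference images; the relative angles between locations would then be extracted from the estimated tensor through the indirect angle-estimation procedure, which the abstract only outlines and which is not reproduced here.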