Visual Homing: Surfing on the Epipoles

  • Authors:
  • Ronen Basri; Ehud Rivlin; Ilan Shimshoni

  • Affiliations:
  • Department of Applied Math, The Weizmann Institute of Science, Rehovot 76100, Israel; Department of Computer Science, The Technion, Haifa 32000, Israel; Department of Industrial Engineering and Management, The Technion, Haifa 32000, Israel

  • Venue:
  • International Journal of Computer Vision
  • Year:
  • 1999

Abstract
Abstract

We introduce a novel method for visual homing. Using this method, a robot can be sent to desired positions and orientations in 3D space specified by single images taken from those positions. Our method is based on recovering the epipolar geometry relating the current image taken by the robot to the target image. From the epipolar geometry, most of the parameters that specify the difference in camera position and orientation between the two images are recovered. However, since not all of the parameters can be recovered from two images, we have developed specific methods to bypass the missing parameters and resolve the ambiguities that remain. We present two homing algorithms for two standard projection models: weak and full perspective. Our method determines the path of the robot on-line, the starting position of the robot is largely unconstrained, and no 3D model of the environment is required. The method is almost entirely memoryless, in the sense that at every step the path to the target position is determined independently of the path previously taken by the robot. Because of this property, the robot can perform auxiliary tasks or avoid obstacles while moving toward the target without impairing its ability to eventually reach the target position. We have performed simulations and real experiments that demonstrate the robustness of the method and show that the algorithms converge to the target pose.
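The abstract assumes the epipolar geometry between the current and target images can be estimated from point correspondences. A minimal sketch of that step, using the standard normalized eight-point algorithm on synthetic correspondences (the camera parameters and data here are illustrative, not taken from the paper):

```python
import numpy as np

def normalize_points(pts):
    # Translate the centroid to the origin and scale the mean
    # distance to sqrt(2), the usual Hartley normalization.
    centroid = pts.mean(axis=0)
    d = np.sqrt(((pts - centroid) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2) / d
    T = np.array([[s, 0, -s * centroid[0]],
                  [0, s, -s * centroid[1]],
                  [0, 0, 1.0]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def fundamental_matrix(x1, x2):
    # Normalized eight-point algorithm: solve x2^T F x1 = 0 in
    # least squares over all correspondences, then enforce rank 2.
    p1, T1 = normalize_points(x1)
    p2, T2 = normalize_points(x2)
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0                      # a valid F is singular
    F = U @ np.diag(S) @ Vt
    return T2.T @ F @ T1            # undo the normalization

def epipoles(F):
    # The epipoles are the right and left null vectors of F.
    _, _, Vt = np.linalg.svd(F)
    e1 = Vt[-1] / Vt[-1][2]         # epipole in image 1
    U, _, _ = np.linalg.svd(F)
    e2 = U[:, -1] / U[2, -1]        # epipole in image 2
    return e1, e2

# Illustrative setup: random 3D points seen by two cameras.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (20, 3)) + np.array([0, 0, 5.0])
K = np.array([[500, 0, 320], [0, 500, 240], [0, 0, 1.0]])
theta = 0.1
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([1.0, 0.2, 0.1])
x1h = (K @ X.T).T
x2h = (K @ (R @ X.T + t[:, None])).T
x1 = x1h[:, :2] / x1h[:, 2:]
x2 = x2h[:, :2] / x2h[:, 2:]

F = fundamental_matrix(x1, x2)
e1, e2 = epipoles(F)
```

The epipoles recovered this way are what the homing strategy steers by: they encode the direction of the translation between the current and target camera positions, which is exactly the part of the motion the robot needs at each step.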