Appearance-based navigation and homing for autonomous mobile robot

  • Authors: Naoya Ohnishi; Atsushi Imiya

  • Affiliations: Graduate School of Science and Technology, Chiba University, Japan; Institute of Media and Information Technology, Chiba University, Japan

  • Venue: Image and Vision Computing
  • Year: 2013

Abstract

In this paper, we develop an algorithm for navigating a mobile robot using the visual potential. The visual potential is computed from an image sequence and from the optical flow of successive images captured by a camera mounted on the robot; that is, the visual potential for navigation is derived from appearances of the workspace observed as an image sequence. The direction to the destination is provided at the robot's initial position. The robot dynamically selects a local pathway to the destination without colliding with obstacles and without any prior knowledge of its workspace. Furthermore, the guidance algorithm allows the mobile robot to return from the destination to its initial position. We present experimental results of navigation and homing in synthetic and real environments.
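
To illustrate the kind of flow-driven potential steering the abstract describes, the sketch below combines an attractive term toward a given goal direction with a repulsive term derived from optical-flow magnitude (a crude proxy for obstacle proximity) and picks the heading that minimizes the combined potential. It is a minimal sketch only: the OpenCV Farneback call is a real API, but the gains, the assumed field of view, and the helper choose_steering_angle are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of optical-flow-based potential steering.
# Assumptions (not from the paper): gain values, 60-degree field of view,
# Farneback flow parameters, and the helper name choose_steering_angle.
import numpy as np
import cv2


def choose_steering_angle(prev_gray, curr_gray, goal_angle,
                          k_attract=1.0, k_repulse=2.0):
    """Pick a steering angle from two consecutive grayscale frames.

    Column-wise optical-flow magnitude serves as a rough repulsive field
    (nearby objects induce larger apparent motion), while a cosine term
    attracts the robot toward the given goal direction (radians).
    """
    # Dense optical flow between successive frames (H x W x 2).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)

    h, w = magnitude.shape
    # Map each image column to a candidate heading in [-FOV/2, +FOV/2].
    fov = np.deg2rad(60.0)
    candidate_angles = np.linspace(-fov / 2, fov / 2, w)

    # Repulsive cost: mean flow magnitude per column (large = likely obstacle).
    repulsion = magnitude.mean(axis=0)
    repulsion /= repulsion.max() + 1e-9

    # Attractive reward: alignment of each candidate heading with the goal.
    attraction = np.cos(candidate_angles - goal_angle)

    # Combined potential: low where the view is free and points toward the goal.
    potential = k_repulse * repulsion - k_attract * attraction
    return candidate_angles[np.argmin(potential)]
```

Under this sketch, homing would amount to re-running the same steering rule with the goal direction reversed, which mirrors the return-to-start behaviour described in the abstract.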