Visual navigation of mobile robot using optical flow and visual potential field

  • Authors:
  • Naoya Ohnishi; Atsushi Imiya

  • Affiliations:
  • Graduate School of Science and Technology, Chiba University, Chiba, Japan; Institute of Media and Information Technology, Chiba University, Chiba, Japan

  • Venue:
  • RobVis'08: Proceedings of the 2nd International Conference on Robot Vision
  • Year:
  • 2008


Abstract

In this paper, we develop a novel algorithm for navigating a mobile robot using the visual potential. The visual potential is computed from an image sequence and the optical flow estimated from successive images captured by the camera mounted on the robot. We assume that the direction to the destination is provided at the robot's initial position. Using this direction, the robot dynamically selects a local pathway to the destination without colliding with obstacles. The proposed algorithm requires no prior knowledge or environmental maps of the robot's workspace. Furthermore, it uses only a monocular uncalibrated camera to detect the feasible region for navigation, since we apply dominant plane detection to identify this region. We present experimental results of navigation in synthetic and real environments. Additionally, we evaluate the robustness of the optical flow computation against lighting effects and various kinds of textures.
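
The abstract outlines a pipeline of optical flow estimation, dominant plane (free-space) detection, and steering from a potential field that attracts toward the destination and repels from obstacles. The authors' own formulation of the visual potential and their dominant plane detector are not reproduced here; the following is only a minimal sketch of that idea under stated assumptions. Farneback dense flow stands in for whatever flow estimator the paper uses, the dominant plane detector is assumed to exist elsewhere and to output an obstacle mask, and the Gaussian-blurred mask used as a repulsive potential, along with all function names and parameters, are illustrative assumptions rather than the authors' implementation.

```python
import cv2
import numpy as np


def dense_flow(prev_gray, curr_gray):
    """Dense optical flow between successive grayscale frames.

    Farneback flow is used here as a stand-in; in the paper the flow
    field feeds dominant plane detection, which is omitted from this
    sketch.
    """
    return cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)


def steering_from_potential(obstacle_mask, goal_direction,
                            attraction=0.5, repulsion=1.0, blur=51):
    """Combine an attractive pull toward the destination with a repulsive
    gradient derived from an obstacle mask (1 = obstacle, 0 = free floor).

    obstacle_mask is assumed to come from a dominant-plane / free-space
    detector; the blurred-mask potential is an illustrative choice, not
    the paper's definition of the visual potential.
    """
    # Smooth the mask so its spatial gradient pushes away from obstacles.
    potential = cv2.GaussianBlur(obstacle_mask.astype(np.float32),
                                 (blur, blur), 0)
    gy, gx = np.gradient(potential)

    # Average the repulsive gradient over the lower half of the image,
    # which roughly corresponds to the ground just ahead of the robot.
    h = potential.shape[0]
    repulse = -np.array([gx[h // 2:].mean(), gy[h // 2:].mean()])

    steer = attraction * np.asarray(goal_direction, float) + repulsion * repulse
    norm = np.linalg.norm(steer)
    return steer / norm if norm > 1e-8 else np.asarray(goal_direction, float)


# Usage sketch (frames and the obstacle mask are assumed to be available):
# flow = dense_flow(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY),
#                   cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY))
# mask = ...  # output of a dominant-plane detector driven by `flow`
# heading = steering_from_potential(mask, goal_direction=np.array([0.0, -1.0]))
```

The only global information this sketch consumes is the destination direction given at the start, matching the abstract's claim that no map or camera calibration is required; everything else is derived frame by frame from the monocular image stream.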