Outdoor autonomous navigation using SURF features

  • Authors: Masayoshi Tabuse, Toshiki Kitaoka, Dai Nakai

  • Affiliations: Graduate School of Life and Environmental Science, Kyoto Prefectural University, Kyoto, Japan 606-8522; X-TRANS, Osaka, Japan; Kyoto Prefectural Subaru High School, Kyoto, Japan

  • Venue: Artificial Life and Robotics
  • Year: 2011

Abstract

In this article, we propose a speeded-up robust features (SURF)-based approach to outdoor autonomous navigation. In this approach, we capture environmental images with an omni-directional camera and extract features from these images using SURF. We treat these features as landmarks to estimate the robot's self-location and direction of motion. SURF features are invariant under scale changes and rotation, and are robust to image noise, changes in lighting conditions, and changes of viewpoint; they are therefore well suited to robot self-localization and navigation. The mobile robot navigation method consists of two modes: a teaching mode and a navigation mode. In the teaching mode, we teach the robot a navigation course; in the navigation mode, the mobile robot autonomously follows the taught course. In our experiment, the outdoor taught course was about 150 m long, the average speed was 2.9 km/h, and the maximum trajectory error was 3.3 m. The processing time of SURF was several times shorter than that of the scale-invariant feature transform (SIFT), so the robot's navigation speed was comparable to a person's walking speed.
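
To illustrate the core matching step the abstract describes, the sketch below pairs SURF features between a taught image and the robot's current camera view, then derives a coarse heading offset from the mean horizontal displacement of matched landmarks in the panoramic image. This is a minimal, hypothetical sketch, not the authors' implementation: it assumes OpenCV built with the contrib `xfeatures2d` module (SURF is patent-encumbered and absent from default builds), and the function name `match_surf_landmarks`, the Hessian threshold of 400, and the 0.75 ratio test are illustrative choices rather than parameters from the paper.

```python
# Hypothetical sketch of SURF landmark matching for teach-and-repeat
# navigation; requires opencv-contrib-python (cv2.xfeatures2d).
import cv2

def match_surf_landmarks(taught_img_path, current_img_path,
                         hessian_threshold=400, ratio=0.75):
    """Match SURF features between a taught image and the current view."""
    taught = cv2.imread(taught_img_path, cv2.IMREAD_GRAYSCALE)
    current = cv2.imread(current_img_path, cv2.IMREAD_GRAYSCALE)

    # SURF detector/descriptor; the Hessian threshold controls how many
    # interest points survive (higher -> fewer, more stable features).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    kp_t, des_t = surf.detectAndCompute(taught, None)
    kp_c, des_c = surf.detectAndCompute(current, None)
    if des_t is None or des_c is None:
        return [], 0.0  # no features detected in one of the images

    # Brute-force matching with Lowe's ratio test to reject ambiguous
    # correspondences before treating them as landmarks.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    candidates = matcher.knnMatch(des_t, des_c, k=2)
    good = [pair[0] for pair in candidates
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]

    # In a panoramic (omni-directional) image, horizontal displacement of
    # matched landmarks corresponds to a heading offset the robot can
    # correct to stay on the taught course.
    offsets = [kp_c[m.trainIdx].pt[0] - kp_t[m.queryIdx].pt[0] for m in good]
    mean_offset = sum(offsets) / len(offsets) if offsets else 0.0
    return good, mean_offset
```

In a full teach-and-repeat loop, such an offset would feed a steering controller at each image recorded along the taught course; the paper's own localization and control details go beyond this sketch.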