A new space and time sensor fusion method for mobile robot navigation

  • Authors:
  • TaeSeok Jin; JangMyung Lee; S. K. Tso

  • Affiliations:
  • RICIC, Department of Electronics Engineering, Pusan National University, Pusan, 609-735, Korea (T. Jin, J. Lee); CIDAM, Department of Manufacturing Engineering & Engineering Management, City University of Hong Kong, Hong Kong (S. K. Tso)

  • Venue:
  • Journal of Robotic Systems
  • Year:
  • 2004

Abstract

To fully utilize the information from a mobile robot's sensors, this paper proposes a new sensor-fusion technique in which the sample data set obtained at a previous instant is properly transformed and fused with the current data sets to produce a reliable estimate for navigation control. Exploration of an unknown environment is an important task for the new generation of mobile service robots, which may navigate by means of a number of sensing systems such as sonar or vision. Note that in conventional fusion schemes, the measurement depends on the current data sets only, so more sensors are required to measure a given physical parameter or to improve the reliability of the measurement. In the proposed approach, instead of adding more sensors to the system, the temporal sequences of the data sets are stored and utilized for this purpose. The basic principle is illustrated by examples, and its effectiveness is verified through simulations and experiments. The newly proposed STSF (space and time sensor fusion) scheme is applied to the navigation of a mobile robot in an environment using landmarks, and the experimental results demonstrate the effective performance of the system. © 2004 Wiley Periodicals, Inc.
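
The abstract only outlines the idea of reusing a previous measurement alongside the current one, so the sketch below is not the paper's actual STSF estimator. It is a minimal illustration, assuming planar motion, Gaussian noise, and hypothetical numbers, of one way such a scheme could work: a landmark position measured at the previous instant is transformed into the current robot frame using the odometry increment (with its covariance inflated by motion noise) and then combined with the current measurement by a standard covariance-weighted fusion step.

```python
import numpy as np

def transform_previous_measurement(z_prev, P_prev, d_pose, Q_motion):
    """Propagate a landmark position measured in the previous robot frame
    (time k-1) into the current robot frame (time k), given the odometry
    increment d_pose = (dx, dy, dtheta) expressed in the previous frame.
    The covariance is inflated by the motion noise Q_motion."""
    dx, dy, dth = d_pose
    c, s = np.cos(dth), np.sin(dth)
    # Rotation from the previous robot frame into the current one (R(dtheta)^T)
    R = np.array([[ c, s],
                  [-s, c]])
    z_t = R @ (z_prev - np.array([dx, dy]))
    P_t = R @ P_prev @ R.T + Q_motion
    return z_t, P_t

def fuse(z_a, P_a, z_b, P_b):
    """Covariance-weighted fusion of two estimates of the same quantity;
    a generic static fusion step, not the paper's specific formulation."""
    K = P_a @ np.linalg.inv(P_a + P_b)   # weight given to the second estimate
    z = z_a + K @ (z_b - z_a)
    P = (np.eye(len(z_a)) - K) @ P_a
    return z, P

if __name__ == "__main__":
    # Hypothetical values: the same landmark seen by sonar at t(k-1)
    # and by vision at t(k).
    z_prev = np.array([2.00, 0.50]); P_prev = np.diag([0.04, 0.04])  # previous frame
    z_curr = np.array([1.82, 0.46]); P_curr = np.diag([0.02, 0.02])  # current frame
    d_pose = (0.2, 0.0, 0.05)              # odometry since t(k-1)
    Q_mot  = np.diag([0.01, 0.01])         # assumed motion-noise inflation

    z_t, P_t = transform_previous_measurement(z_prev, P_prev, d_pose, Q_mot)
    z_f, P_f = fuse(z_t, P_t, z_curr, P_curr)
    print("fused landmark position (current frame):", z_f)
    print("fused covariance:\n", P_f)
```

Reusing the transformed previous sample this way reduces the fused covariance without adding physical sensors, which is the effect the abstract attributes to storing and exploiting temporal sequences of data.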