Sensor Fusion and Planning with Perception–Action Network

  • Authors:
  • Sukhan Lee

  • Affiliations:
Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California 91109, U.S.A.
  • Depts. of EE-Systems and CS, University of Southern California, Los Angeles, CA 90089-0781, U.S.A.

  • Venue:
  • Journal of Intelligent and Robotic Systems
  • Year:
  • 1997


Abstract

Robot intelligence requires a real-time connection between sensing and action. A new computational principle that efficiently implements such a connection is of the utmost importance for the next generation of robotics. In this paper, a perception–action network is presented as a means of efficiently integrating sensing, knowledge, and action for sensor fusion and planning. The network consists of a number of heterogeneous computational units, representing feature transformation and decision-making for action, which are interconnected as a dynamic system. New input stimuli invoke the evolution of the network state to a new equilibrium, through which a real-time integration of sensing, knowledge, and action is accomplished. The network provides a formal, yet general and efficient, method of achieving sensor fusion and planning, because the uncertainties of signals propagated through the network can be controlled by modifying sensing parameters and robot actions. Algorithms for sensor planning based on the proposed network are established and applied to robot self-localization. Simulation and experimental results are presented.
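
The following is a minimal, hypothetical Python sketch of the ideas summarized in the abstract: a small network of heterogeneous units (feature transformation, fusion, and a toy sensor-planning decision) through which a new sensor stimulus propagates until the network state settles, with signal uncertainty (variance) carried along. The class names, the inverse-variance fusion rule, and the planning criterion are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a perception-action network: feature units transform
# raw readings, a fusion unit combines them, and a planning rule decides which
# sensing parameter to refine.  Assumptions: scalar state, linear feature
# transforms, inverse-variance (information) fusion.
import numpy as np


class FeatureUnit:
    """Maps a raw sensor reading to a feature estimate plus its variance."""
    def __init__(self, gain: float, noise_var: float):
        self.gain = gain              # linear feature transformation
        self.noise_var = noise_var    # sensing parameter: measurement noise
        self.estimate, self.variance = 0.0, np.inf

    def update(self, reading: float) -> None:
        self.estimate = self.gain * reading
        self.variance = (self.gain ** 2) * self.noise_var  # propagate uncertainty


class FusionUnit:
    """Combines feature estimates by inverse-variance weighting."""
    def __init__(self):
        self.estimate, self.variance = 0.0, np.inf

    def update(self, units) -> None:
        info = np.array([1.0 / u.variance for u in units])  # information = 1/variance
        vals = np.array([u.estimate for u in units])
        self.variance = 1.0 / info.sum()
        self.estimate = self.variance * (info * vals).sum()


def propagate(features, fusion, readings, tol=1e-9, max_sweeps=50):
    """Let a new stimulus drive the network state to a fixed point (equilibrium)."""
    prev = None
    for _ in range(max_sweeps):
        for unit, reading in zip(features, readings):
            unit.update(reading)
        fusion.update(features)
        if prev is not None and abs(fusion.estimate - prev) < tol:
            break                     # network state has stopped changing
        prev = fusion.estimate
    return fusion.estimate, fusion.variance


def plan_sensing(features, improvement=0.5):
    """Toy sensor planning: pick the sensor whose noise reduction (by the given
    factor) would most shrink the fused variance."""
    def fused_var(noise_vars):
        return 1.0 / sum(1.0 / (f.gain ** 2 * nv) for f, nv in zip(features, noise_vars))
    base = [f.noise_var for f in features]
    gains = []
    for i in range(len(features)):
        trial = list(base)
        trial[i] *= improvement
        gains.append(fused_var(base) - fused_var(trial))
    return int(np.argmax(gains))


if __name__ == "__main__":
    # Two sensors observing the same robot position (a toy self-localization case).
    sensors = [FeatureUnit(gain=1.0, noise_var=0.04),
               FeatureUnit(gain=1.0, noise_var=0.25)]
    fusion = FusionUnit()
    est, var = propagate(sensors, fusion, readings=[2.03, 1.91])
    print(f"fused position estimate: {est:.3f}  variance: {var:.4f}")
    print("sensor to refine next:", plan_sensing(sensors))
```

In this sketch the feedforward graph settles after a single sweep, so the fixed-point loop is kept only to mirror the abstract's picture of a network relaxing to equilibrium; a recurrent interconnection would make the iteration substantive.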