Estimation level fusion in multisensor environment

  • Authors:
  • Vladimir Shin; Rashid Minhas; Georgy Shevlyakov; Kiseon Kim

  • Affiliations:
  • Vladimir Shin, Rashid Minhas: Department of Mechatronics, Gwangju Institute of Science and Technology, Gwangju, Republic of Korea
  • Georgy Shevlyakov, Kiseon Kim: Department of Information and Communications, Gwangju Institute of Science and Technology, Gwangju, Republic of Korea

  • Venue:
  • ISPRA'06 Proceedings of the 5th WSEAS International Conference on Signal Processing, Robotics and Automation
  • Year:
  • 2006

Abstract

The integration and fusion of information from a combination of different types of instruments (sensors) is often used in the design of control systems. Typical applications that can benefit from the use of multiple sensors include industrial tasks, military command, mobile robot navigation, multi-target tracking, and aircraft navigation. In recent years, there has been growing interest in fusing multisensor data to increase the accuracy of estimated parameters and system states. This interest is motivated by the availability of different types of sensors with different spectral characteristics. The observations used in the estimation process are assigned to a common target through an association process. Once it is decided that all local sensors observe the same target, the next problem is how to combine (fuse) the corresponding local estimates. A new algorithm for estimation level fusion is proposed that uses an optimal mean-square combination of an arbitrary number of local estimates. The local estimates are produced by applying a Kalman filter to the individual sensor measurements.
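
The abstract does not reproduce the paper's fusion equations. As an illustration only, the sketch below shows the standard covariance-weighted (minimum mean-square error) combination of mutually uncorrelated local Kalman-filter estimates, which is the classical starting point for estimation-level fusion; the proposed algorithm addresses the more general case of an arbitrary number of local estimates. The function name `fuse_local_estimates` and the numerical values are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_local_estimates(estimates, covariances):
    """Covariance-weighted fusion of local state estimates.

    Assumes the local estimation errors are mutually uncorrelated;
    cross-correlations between local filters are not modeled here.
    """
    # Fused covariance: inverse of the sum of local information matrices
    info_sum = sum(np.linalg.inv(P) for P in covariances)
    P_fused = np.linalg.inv(info_sum)
    # Fused estimate: information-weighted combination of local estimates
    x_fused = P_fused @ sum(np.linalg.inv(P) @ x
                            for x, P in zip(estimates, covariances))
    return x_fused, P_fused

# Example: two local (Kalman-filter) estimates of a 2-D state
x1, P1 = np.array([1.0, 0.5]), np.diag([0.2, 0.3])
x2, P2 = np.array([1.2, 0.4]), np.diag([0.4, 0.1])
x_f, P_f = fuse_local_estimates([x1, x2], [P1, P2])
print(x_f)
print(P_f)
```

In this uncorrelated case the fused covariance is never larger than any local covariance, which is the motivation for fusing at the estimate level rather than relying on a single sensor.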