Sensor fusion-based visual target tracking for autonomous vehicles with the out-of-sequence measurements solution

  • Authors:
  • Zhen Jia;Arjuna Balasuriya;Subhash Challa

  • Affiliations:
  • School of EEE, Nanyang Technological University, Singapore 639798, Singapore;School of EEE, Nanyang Technological University, Singapore 639798, Singapore;Faculty of Engineering, The University of Technology, Sydney, Australia

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2008


Abstract

In this paper, a novel algorithm is proposed for visual target tracking by Autonomous Guided Vehicles (AGVs). The paper proposes a sensor data fusion system to estimate the dynamics of the target. Optical flow vectors, colour features, and stereo pair disparities are used as the visual features, while the vehicle's inertial measurements are used to estimate the stereo cameras' motion. The algorithm estimates the velocity and position of the target, which are then used by the vehicle to track the target. In this sensor data fusion-based tracking system, measurements from the same target can arrive out of sequence. This is called the "Out-Of-Sequence" Measurements (OOSM) problem. Thus the resulting problem, namely how to update the current state estimate with an "older" measurement, needs to be solved. In this paper the 1-step-lag OOSM solution from Bar-Shalom is applied to the Extended Kalman Filter-based target-state estimation. The performance of the proposed tracking algorithm with the OOSM solution is demonstrated through extensive experimental results.
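The 1-step-lag situation described above can be sketched with a toy example. The snippet below uses a 1-D constant-velocity linear Kalman filter as a simplified stand-in for the paper's EKF, and resolves the late measurement by rolling back to a stored checkpoint and replaying measurements in time order. This rollback-and-reprocess approach is a simple illustrative alternative, not Bar-Shalom's retrodiction algorithm itself (which avoids storing and replaying past measurements); all model matrices and measurement values here are invented for illustration.

```python
import numpy as np

# Toy 1-D constant-velocity model (dt = 1); these matrices are
# illustrative, not from the paper.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])               # position-only measurement
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.25]])                   # measurement noise covariance

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (np.atleast_1d(z) - H @ x), (np.eye(2) - K @ H) @ P

# Reference: in-sequence processing. Two sensors measure at time k
# (values 1.1 and 0.9), then one sensor measures at time k+1 (2.05).
x, P = np.array([0.0, 1.0]), np.eye(2)
x, P = predict(x, P)
x, P = update(x, P, [1.1])
x, P = update(x, P, [0.9])
x, P = predict(x, P)
x, P = update(x, P, [2.05])
x_ref, P_ref = x, P

# OOSM case: the second time-k measurement (0.9) arrives only after
# step k+1 has already been processed. Handle it by restoring the
# checkpoint saved at time k, fusing the late measurement, and
# replaying step k+1.
x, P = np.array([0.0, 1.0]), np.eye(2)
x, P = predict(x, P)
x, P = update(x, P, [1.1])
ckpt = (x.copy(), P.copy())              # checkpoint at time k
x, P = predict(x, P)
x, P = update(x, P, [2.05])              # step k+1 already done
x, P = ckpt                              # late measurement arrives: roll back
x, P = update(x, P, [0.9])               # fuse the delayed measurement
x, P = predict(x, P)
x, P = update(x, P, [2.05])              # replay step k+1

# Rollback-and-reprocess exactly reproduces in-sequence processing.
assert np.allclose(x, x_ref) and np.allclose(P, P_ref)
```

Because replaying measurements in the correct order is exact, this buffering scheme recovers the in-sequence estimate; Bar-Shalom's 1-step-lag solution achieves (approximately) the same update directly from the current state, which is why it suits real-time trackers that cannot afford to store and reprocess measurements.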