An energy-based vehicle tracking system using principal component analysis and unsupervised ART network

  • Authors:
  • A. Srikaew;P. Kumsawat;K. Attakitmongcol;N. Sroisuwan;C. Sotthithaworn

  • Affiliations:
  • Robotics & Automation Research Unit for Real-World Applications, School of Electrical Engineering, Suranaree University of Technology, Nakhon Ratchasima, Thailand (all authors)

  • Venue:
  • AIKED'09 Proceedings of the 8th WSEAS international conference on Artificial intelligence, knowledge engineering and data bases
  • Year:
  • 2009

Abstract

This work presents an automatic vehicle detection and tracking system that operates on a sequence of images. The detection stage uses energy-based images, including symmetry energy, Gabor energy, and road energy, to initially locate vehicles in each frame. The tracking stage then employs an adaptive resonance theory (ART) network to recognize and track vehicles based on their energy images. The vehicle energy images are fed into the network, which automatically learns salient vehicle features by analyzing their principal components. Because the network is unsupervised, the system can track efficiently in dynamic environments where vehicle shapes and sizes change continuously. Using the vehicle energy model, the proposed system can also track multiple vehicles simultaneously, in both frontal and rear views. Results and discussion are presented.
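To make the pipeline described in the abstract more concrete, the sketch below illustrates one plausible reading of its three ingredients: a Gabor-energy map computed over a grayscale frame, a PCA projection of candidate vehicle patches, and a small ART-style matcher that assigns each feature vector to an existing vehicle category or opens a new one. This is not the authors' implementation; the filter parameters, the number of principal components, the vigilance threshold, and all function names are illustrative assumptions.

```python
import numpy as np
import cv2


def gabor_energy(gray, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Sum of squared Gabor filter responses over several orientations."""
    energy = np.zeros_like(gray, dtype=np.float64)
    for theta in thetas:
        kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        response = cv2.filter2D(gray.astype(np.float64), cv2.CV_64F, kernel)
        energy += response ** 2
    return energy


def pca_project(patches, n_components=8):
    """Project flattened energy patches onto their leading principal components."""
    X = np.stack([p.ravel() for p in patches]).astype(np.float64)
    X -= X.mean(axis=0)
    # SVD of the centered data matrix gives the principal axes in Vt.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T


class SimpleART:
    """ART-like clusterer: assign a feature vector to the closest existing
    prototype if the match exceeds a vigilance threshold, otherwise create
    a new category (here, a newly tracked vehicle)."""

    def __init__(self, vigilance=0.9, learning_rate=0.3):
        self.vigilance = vigilance        # assumed value, not from the paper
        self.learning_rate = learning_rate
        self.prototypes = []

    def present(self, feature):
        feature = feature / (np.linalg.norm(feature) + 1e-9)
        best, best_score = None, -1.0
        for idx, proto in enumerate(self.prototypes):
            score = float(feature @ proto)          # cosine similarity
            if score > best_score:
                best, best_score = idx, score
        if best is not None and best_score >= self.vigilance:
            # Resonance: nudge the winning prototype toward the new input.
            updated = ((1 - self.learning_rate) * self.prototypes[best]
                       + self.learning_rate * feature)
            self.prototypes[best] = updated / np.linalg.norm(updated)
            return best
        self.prototypes.append(feature)             # new vehicle category
        return len(self.prototypes) - 1
```

In a per-frame loop, candidate regions found by the symmetry/road-energy detection step would be cropped from the energy map, projected together via pca_project, and presented one by one to SimpleART; the returned category index then serves as a track identity that persists across frames even as vehicle appearance changes.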