An Adaptive Fusion Architecture for Target Tracking

  • Authors:
  • Gareth Loy, Luke Fletcher, Nicholas Apostoloff, Alexander Zelinsky

  • Venue:
  • FGR '02 Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition
  • Year:
  • 2002

Abstract

A vision system is demonstrated that adaptively allocates computational resources over multiple cues to robustly track a target in 3D. The system uses a particle filter to maintain multiple hypotheses of the target location. Bayesian probability theory provides the framework for sensor fusion, and resource scheduling is used to intelligently allocate the limited computational resources available across the suite of cues. The system is shown to track a person in 3D space moving in a cluttered environment.
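The core fusion idea in the abstract, a particle filter whose hypotheses are reweighted by combining evidence from several cues under Bayes' rule, can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the 1D state, Gaussian cue models, and all function names and parameters are assumptions made for clarity (the paper tracks a target in 3D).

```python
import math
import random

def gaussian_likelihood(x, mean, sigma):
    """Unnormalized Gaussian likelihood of particle state x given a cue reading."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2)

def particle_filter_step(particles, cue_readings, motion_sigma=0.5):
    """One predict-update-resample cycle fusing multiple cues.

    particles    : list of scalar state hypotheses (1D position, for illustration)
    cue_readings : list of (measurement, noise_sigma) pairs, one per cue
    """
    # Predict: diffuse each hypothesis with a random-walk motion model.
    moved = [p + random.gauss(0.0, motion_sigma) for p in particles]

    # Update: Bayesian fusion of independent cues -- the weight of each
    # particle is the product of the per-cue likelihoods.
    weights = []
    for p in moved:
        w = 1.0
        for reading, sigma in cue_readings:
            w *= gaussian_likelihood(p, reading, sigma)
        weights.append(w)
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]

    # Resample hypotheses in proportion to their fused weight.
    resampled = random.choices(moved, weights=weights, k=len(moved))
    estimate = sum(resampled) / len(resampled)
    return resampled, estimate

# Illustrative usage: two noisy cues observing a target at position 3.0.
random.seed(0)
particles = [random.uniform(-10.0, 10.0) for _ in range(500)]
for _ in range(20):
    particles, estimate = particle_filter_step(
        particles, cue_readings=[(3.0, 1.0), (3.0, 2.0)]
    )
```

After a few iterations the particle cloud concentrates near the state best supported by all cues jointly; a cue with larger noise sigma contributes a flatter likelihood and so influences the fused weight less, which is the mechanism the paper exploits when allocating resources across cues.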