Monocular pedestrian tracking from a moving vehicle

  • Authors:
  • Zipei Fan; Zeliang Wang; Jinshi Cui; Franck Davoine; Huijing Zhao; Hongbin Zha

  • Affiliations:
  • Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China (all authors)

  • Venue:
  • ACCV'12: Proceedings of the 11th Asian Conference on Computer Vision - Volume 2
  • Year:
  • 2012

Abstract

Tracking pedestrians from a moving vehicle equipped with a monocular camera is still considered a challenging problem in both computer vision and robotics. In this paper, we address this problem in a particle filter framework that incorporates cues from a pedestrian detector, a learnt dynamic model, and target-specific tracking. To eliminate the effect of ego-motion when predicting pedestrian movement, we train one dynamic model for each driving behavior (moving forward, turning left/right) from a set of training trajectories. The learnt dynamic model is then used to predict the future movement of the pedestrian during tracking. We demonstrate that our system works robustly on a challenging dataset with strong illumination changes.
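
The abstract only outlines the approach, so the Python sketch below is a minimal, hypothetical illustration (not the authors' implementation) of how one particle filter step could combine a behavior-specific dynamic model with detector and target-specific appearance likelihoods. The state layout [x, y, vx, vy], the BEHAVIOR_DYNAMICS table, and the detector_score/appearance_score functions are assumptions introduced here for illustration; in the paper the per-behavior dynamic models are learnt from training trajectories.

import numpy as np

rng = np.random.default_rng(0)

# Per-particle state: [x, y, vx, vy] in image coordinates (assumption).
# The drift/noise values below are illustrative placeholders standing in
# for the dynamic models learnt per driving behavior to compensate ego-motion.
BEHAVIOR_DYNAMICS = {
    "forward":    {"drift": np.array([0.0, 2.0, 0.0, 0.0]), "noise": 1.0},
    "turn_left":  {"drift": np.array([-3.0, 1.0, 0.0, 0.0]), "noise": 2.0},
    "turn_right": {"drift": np.array([3.0, 1.0, 0.0, 0.0]), "noise": 2.0},
}

def predict(particles, behavior):
    """Propagate particles with the dynamic model of the current driving behavior."""
    dyn = BEHAVIOR_DYNAMICS[behavior]
    particles[:, :2] += particles[:, 2:]                  # constant-velocity step
    particles += dyn["drift"] + rng.normal(0.0, dyn["noise"], particles.shape)
    return particles

def update(weights, particles, detector_score, appearance_score):
    """Re-weight particles by combining detector and target-specific appearance cues."""
    weights = weights * detector_score(particles) * appearance_score(particles)
    weights += 1e-12                                      # guard against all-zero weights
    return weights / weights.sum()

def resample(particles, weights):
    """Resample when the effective sample size collapses."""
    n = len(weights)
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Example: one tracking step for a single pedestrian with dummy likelihoods.
particles = rng.normal([320.0, 240.0, 0.0, 1.0], 5.0, size=(200, 4))
weights = np.full(200, 1.0 / 200)
particles = predict(particles, "turn_left")
weights = update(weights, particles, lambda p: np.ones(len(p)), lambda p: np.ones(len(p)))
particles, weights = resample(particles, weights)

In this sketch, switching the behavior key selects a different prediction model, which mirrors the paper's idea of training one dynamic model per driving behavior rather than a single motion prior.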