Memory-based particle filter for tracking objects with large variation in pose and appearance

  • Authors:
  • Dan Mikami; Kazuhiro Otsuka; Junji Yamato

  • Affiliations:
  • NTT Communication Science Laboratories (all authors)

  • Venue:
  • ECCV'10: Proceedings of the 11th European Conference on Computer Vision, Part III
  • Year:
  • 2010

Abstract

A novel memory-based particle filter is proposed to achieve robust visual tracking of a target's pose even under large variations in the target's position and rotation, i.e., large appearance changes. The memory-based particle filter (M-PF) is a recent extension of the particle filter that incorporates a memory-based mechanism to predict the prior distribution from the stored history of target states; it offers robust tracking against complex motion. This paper extends the M-PF to a unified probabilistic framework for joint estimation of the target's pose and appearance, based on memory-based joint prior prediction using stored past pose and appearance sequences. We call it the Memory-based Particle Filter with Appearance Prediction (M-PFAP). The memory-based approach generates the joint prior distribution of pose and appearance without explicitly modeling the complex relationship between them. M-PFAP robustly handles the large appearance changes caused by large pose variation, as well as abrupt changes in moving direction, and allows robust tracking under self- and mutual occlusion. Experiments confirm that M-PFAP successfully tracks human faces from frontal view to profile view and greatly eases the limitations of M-PF.
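The core idea of memory-based prior prediction can be sketched as follows. This is a hypothetical minimal illustration, not the paper's algorithm: the function name `memory_based_prior`, the similarity-weighted retrieval of past states, and all parameter values are illustrative assumptions. The sketch replaces a parametric dynamics model with retrieval from the stored state history: past states similar to the current one are matched, and the state that followed each match is reused as a prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

def memory_based_prior(history, n_particles=100, noise=0.05):
    """Sample a prior for the next state by retrieving from past sequences.

    Hypothetical sketch of the M-PF idea: match the current state against
    stored past states, then reuse each match's successor as a predicted
    particle, rather than propagating particles through a motion model.
    """
    current = history[-1]
    past = np.asarray(history[:-1])
    if len(past) < 2:
        # Not enough memory yet: fall back to a simple random walk.
        return current + noise * rng.standard_normal((n_particles, current.shape[0]))
    # Distance from the current state to every past state whose successor
    # is also stored (indices 0 .. len(past) - 2).
    dists = np.linalg.norm(past[:-1] - current, axis=1)
    # Closer matches get larger retrieval probability (illustrative kernel).
    weights = np.exp(-dists / (dists.mean() + 1e-9))
    weights /= weights.sum()
    idx = rng.choice(len(past) - 1, size=n_particles, p=weights)
    # Predicted particles are the successors of the matched states, jittered.
    return past[idx + 1] + noise * rng.standard_normal((n_particles, current.shape[0]))
```

For example, given a stored trajectory of 2-D states, `memory_based_prior(history)` returns an array of predicted particles that follow the remembered motion pattern, which is how a history of pose (and, in M-PFAP, appearance) sequences can drive prediction without an explicit dynamics model.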