Dynamic context for tracking behind occlusions
ECCV'12: Proceedings of the 12th European Conference on Computer Vision - Volume Part V
This paper considers the problem of sustained multicamera tracking in the presence of occlusion and changes in the target motion model. The key insight of the proposed method is that, under mild conditions, the 2D trajectories of the target in the image planes of each of the cameras are constrained to evolve in the same subspace. This observation allows for identifying, at each time instant, a single (piecewise) linear model that explains all the available 2D measurements. In turn, this model can be used in the context of a modified particle filter to predict future target locations. When the target is occluded to some of the cameras, the missing measurements can be estimated using the fact that they must both lie in the subspace spanned by previous measurements and satisfy epipolar constraints. Hence, by exploiting both dynamical and geometrical constraints, the proposed method can robustly handle substantial occlusion without the need for 3D reconstruction, camera calibration, or constraints on sensor separation. The performance of the proposed tracker is illustrated with several challenging examples involving targets that substantially change appearance and motion models while occluded to some of the cameras.
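The shared-subspace idea behind the abstract can be sketched with synthetic data. Below is a minimal illustration (an assumption-laden toy, not the paper's algorithm) using affine cameras: each camera's 2D trajectory is an affine image of the same 3D trajectory, so the stacked trajectory matrix has rank at most 4. A measurement occluded to one camera can then be recovered by expressing the incomplete frame in the row-space basis learned from the other frames, using only the entries seen by the unoccluded cameras. The camera models, frame counts, and occlusion pattern are all hypothetical; the epipolar refinement step described in the abstract is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic ground truth: a smooth 3D trajectory over T frames ---
T = 60
t = np.linspace(0.0, 1.0, T)
traj3d = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), t], axis=1)
H = np.hstack([traj3d, np.ones((T, 1))])          # (T, 4) homogeneous coords

# Three affine cameras: each 2D trajectory is H @ A_i, so every image
# trajectory lies in the same rank-4 temporal subspace (the key insight).
A = [rng.normal(size=(4, 2)) for _ in range(3)]
W = [H @ Ai for Ai in A]                          # list of (T, 2) trajectories

M = np.hstack(W + [np.ones((T, 1))])              # (T, 7) stacked, rank <= 4

# Camera 1 (columns 2:4 of M) is occluded at frame t_occ.
t_occ, occ_cols = 40, slice(2, 4)
obs_cols = [0, 1, 4, 5, 6]                        # camera 0, camera 2, const

# Row-space basis estimated from the fully observed frames.
complete = np.delete(M, t_occ, axis=0)
_, _, Vt = np.linalg.svd(complete, full_matrices=False)
V = Vt[:4].T                                      # (7, 4) basis of row space

# The occluded frame must lie in span(V): solve for its coefficients
# from the entries seen by the unoccluded cameras, then fill in the rest.
c, *_ = np.linalg.lstsq(V[obs_cols], M[t_occ, obs_cols], rcond=None)
estimate = V @ c
print(np.allclose(estimate[occ_cols], M[t_occ, occ_cols], atol=1e-6))
```

With noise-free affine projections the occluded 2D point is recovered exactly; in the paper's setting the subspace constraint is combined with epipolar geometry and a particle filter to make the estimate robust to noise and model switches.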