Piecewise affine kernel tracking for non-planar targets
Pattern Recognition
We present a tunable representation for tracking that simultaneously encodes appearance and geometry in a manner that enables the use of mean-shift iterations. The classic mean-shift formulation of the tracking problem encodes spatial information only loosely (i.e., through radially symmetric kernels). As a result, the tracker is easily confused by other objects that have the same feature distribution but a different spatial configuration of features. Subsequent approaches have addressed this issue, but not with the generality required for tracking specific classes of objects and motions (e.g., walking humans). In this paper, we formulate the tracking problem so that it encodes the spatial configuration of features along with their density, yet retains robustness to spatial deformations and variations in feature density. The spatial configuration is encoded by a set of kernels whose parameters can be optimized off-line for a given class of objects and motions. The formulation enables the use of mean-shift iterations and runs in real time. We demonstrate better tracking results on synthetic and real image sequences than with the original mean-shift tracker.