Object tracking using learned feature manifolds
Computer Vision and Image Understanding
Object tracking is one of the fundamental problems in computer vision and has received considerable attention over the past two decades. The success of a tracking algorithm relies on two key issues: 1) an effective representation so that the object being tracked can be distinguished from the background and from other objects, and 2) an update scheme for the object representation that accommodates changes in the object's appearance and structure. Despite the progress made in the past, reliable and efficient tracking of objects with changing appearance remains a challenging problem. In this paper, a novel sparse, local feature-based object representation, the attributed relational feature graph, is proposed to solve this problem. The object is modeled using invariant features such as the scale-invariant feature transform (SIFT), and the geometric relations among features are encoded in the form of a graph. A dynamic model is developed to evolve the feature graph in response to appearance and structure changes by adding new stable features and removing inactive ones. Extensive experiments show that our method achieves reliable tracking even under significant appearance changes, viewpoint changes, and occlusion.
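The update scheme described above can be illustrated with a minimal sketch. The class below is a hypothetical simplification, not the paper's implementation: nodes stand in for local features (a real system would store SIFT descriptors), edges stand in for geometric relations, and the thresholds `stable_after` and `drop_after` are assumed parameters governing when a feature is considered stable or inactive.

```python
from dataclasses import dataclass

@dataclass
class FeatureNode:
    descriptor: tuple   # stand-in for a SIFT descriptor
    position: tuple     # (x, y) location in the current frame
    hits: int = 1       # frames in which the feature was matched
    misses: int = 0     # consecutive frames without a match

class FeatureGraph:
    """Minimal sketch of an attributed relational feature graph:
    nodes are local features, edges encode pairwise relations."""

    def __init__(self, stable_after=3, drop_after=5):
        self.nodes = {}        # node id -> FeatureNode
        self.edges = set()     # pairs (id_a, id_b)
        self._next_id = 0
        self.stable_after = stable_after  # hits needed to count as stable
        self.drop_after = drop_after      # misses before removal

    def add_feature(self, descriptor, position):
        """Insert a newly detected feature and link it to existing nodes
        (here simply to all of them; a real system would link neighbours)."""
        nid = self._next_id
        self._next_id += 1
        for other in self.nodes:
            self.edges.add((other, nid))
        self.nodes[nid] = FeatureNode(descriptor, position)
        return nid

    def update(self, matched_ids):
        """Evolve the graph for one frame: reward matched features,
        age the unmatched ones, and drop long-inactive features."""
        for nid, node in list(self.nodes.items()):
            if nid in matched_ids:
                node.hits += 1
                node.misses = 0
            else:
                node.misses += 1
                if node.misses >= self.drop_after:
                    del self.nodes[nid]
                    self.edges = {e for e in self.edges if nid not in e}

    def stable_ids(self):
        """Features matched often enough to be considered stable."""
        return [nid for nid, n in self.nodes.items()
                if n.hits >= self.stable_after]
```

In use, a tracker would match the current frame's detected features against the graph, pass the matched node ids to `update`, and call `add_feature` for detections with no counterpart, so the representation drifts with the object's appearance.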