The destination of a traditional robot navigation task is usually a static location. However, many real-life applications require a robot to continuously identify and find its way toward a non-static target, e.g., following a walking person. In this paper, we present a navigation framework for this task based on simultaneous navigation and tracking. It consists of iterations of data acquisition, perception/cognition, and motion execution. In the perception/cognition step, visual tracking is introduced to keep track of the target object. This setting is much more challenging than regular tracking tasks, because the target object shows much larger variance in location, shape, and size across consecutive images acquired while navigating. A Footprint Detection based Tracker (FD-Tracker) is proposed to robustly track the target object in such scenarios. We first perform object footprint detection in the plan-view map to identify possible target locations. This information is then fused into a Bayesian tracking framework to prune target candidates. Compared to previous methods, our results demonstrate that using footprints can boost the performance of a visual tracker. Promising experimental results of navigating a robot to various goals in an office environment further prove the robustness of our navigation framework.
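The fusion step described above — weighting visual target candidates by a footprint prior from the plan-view map and pruning the rest — can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method: the function names, the inverse-distance appearance score, and the binary-with-floor footprint prior are all hypothetical stand-ins for the likelihoods the paper would define.

```python
def footprint_prior(candidate, footprints, radius=1.0):
    """Hypothetical prior: full weight if the candidate lies within
    `radius` of any detected footprint in the plan-view map, else a
    small floor value so no candidate is ruled out entirely."""
    for fx, fy in footprints:
        if (candidate[0] - fx) ** 2 + (candidate[1] - fy) ** 2 <= radius ** 2:
            return 1.0
    return 0.05

def appearance_likelihood(candidate, target_estimate):
    """Stand-in for a visual similarity score (e.g. a colour-histogram
    match); here simply inverse squared distance to the last estimate."""
    dx = candidate[0] - target_estimate[0]
    dy = candidate[1] - target_estimate[1]
    return 1.0 / (1.0 + dx * dx + dy * dy)

def fuse_and_prune(candidates, target_estimate, footprints, keep=3):
    """Bayesian-style fusion: score = appearance x footprint prior,
    then keep only the top-`keep` candidates."""
    scored = [(appearance_likelihood(c, target_estimate)
               * footprint_prior(c, footprints), c) for c in candidates]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [c for _, c in scored[:keep]]

# Toy run: one footprint near (2, 2); the far-away candidate is pruned.
footprints = [(2.0, 2.0)]
candidates = [(2.1, 2.0), (5.0, 5.0), (2.0, 1.9)]
survivors = fuse_and_prune(candidates, (2.0, 2.0), footprints, keep=2)
```

In this toy run the candidate at (5.0, 5.0) is suppressed twice over — low appearance score and no nearby footprint — which is the intuition behind using the footprint detection to prune drifting visual candidates.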