Analytic fusion of visual cues in model-based camera tracking
Proceedings of the 8th International Conference on Virtual Reality Continuum and its Applications in Industry
This paper proposes a model-based, object-adaptive tracking method that uses both edges and feature points as vision cues and flexibly adjusts the contribution of each cue through a single parameter, based on the characteristics of the tracked object and the initial conditions. In many situations where conventional object-tracking methods fail, the proposed method is shown to provide reasonably good results. It ran at 20 fps on a UMPC with an average tracking error within 3 pixels at a camera image resolution of 640 × 480 pixels, and this real-time capability enabled the method to be successfully applied to an augmented reality (AR) guidance system for the National Science Museum.
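The single-parameter cue weighting described in the abstract can be sketched as a convex combination of per-cue error terms. This is an assumed form for illustration only; the function name, the weight `alpha`, and the mean-squared-error terms are hypothetical, and the paper's actual formulation may differ:

```python
import numpy as np

def fused_cost(edge_residuals, point_residuals, alpha):
    """Blend edge-cue and feature-point-cue errors with one weight.

    alpha in [0, 1]: values near 1 favor the edge cue, values near 0
    favor the feature-point cue. (Hypothetical sketch of a single-parameter
    fusion; not the paper's exact cost function.)
    """
    # Mean squared error per cue; an empty residual set contributes zero.
    e_edge = np.mean(np.square(edge_residuals)) if len(edge_residuals) else 0.0
    e_point = np.mean(np.square(point_residuals)) if len(point_residuals) else 0.0
    # Convex combination controlled by the single parameter alpha.
    return alpha * e_edge + (1.0 - alpha) * e_point
```

A pose optimizer could minimize this fused cost while `alpha` is tuned per object, e.g. raising it for strongly textured-edge models and lowering it when feature points dominate.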