Digilog miniature: real-time, immersive, and interactive AR on miniatures
Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry
A tracking framework for augmented reality tours on cultural heritage sites
Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry
Model-based camera tracking is a technology that estimates a precise camera pose from visual cues (e.g., feature points, edges) extracted from camera images, given a 3D scene model and a rough initial camera pose. This paper proposes an automatic method for flexibly adjusting the confidence assigned to each visual cue in model-based camera tracking. The adjustment is based on the conditions of the target object and scene and on the reliability of the initial or previous camera pose. Under uncontrolled or loosely controlled working environments, the proposed object-adaptive tracking method runs at 20 frames per second on an ultra-mobile personal computer (UMPC) with an average tracking error within 3 pixels at a camera image resolution of 320 by 240 pixels. This capability enabled the proposed method to be successfully applied to a mobile augmented reality (AR) guidance system for a museum. Copyright © 2009 John Wiley & Sons, Ltd.

Figure: Object-adaptive camera tracking. The red wireframe lines represent the 3D graphic model of the objects. The first-row images are the initial tracking results obtained from ultrasonic and inertial sensors. The second-, third-, and fourth-row images are the results when the η values are 0, 0.3, and 1, respectively. The images marked with black boxes are the results of the object-adaptive tracking method, where the η values are automatically adjusted to the optimal value for each object.
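The abstract describes blending visual cues with a weight η that is tuned per object. As an illustrative sketch only (the function names, the convention that η weights edge cues against point cues, and the candidate η values are assumptions, not the paper's exact formulation), the per-object selection of η might look like:

```python
import numpy as np

def combined_residual(edge_err, point_err, eta):
    """Blend edge-based and feature-point-based tracking residuals.

    Assumed convention (illustrative, not from the paper):
    eta = 0 relies entirely on feature points, eta = 1 entirely on edges.
    """
    return eta * edge_err + (1.0 - eta) * point_err

def select_eta(edge_errs, point_errs, candidates=(0.0, 0.3, 1.0)):
    """Pick the candidate eta whose mean blended residual is smallest
    for the current object, mimicking the per-object adjustment the
    abstract describes (candidate set taken from the figure caption)."""
    edge_errs = np.asarray(edge_errs, dtype=float)
    point_errs = np.asarray(point_errs, dtype=float)
    totals = {
        eta: float(np.mean(combined_residual(edge_errs, point_errs, eta)))
        for eta in candidates
    }
    return min(totals, key=totals.get)
```

For an object with strong edges (low edge residuals), this selection favors η near 1; for a textured object with reliable feature points, it favors η near 0.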