Vision-based tracking systems have advantages for augmented reality (AR) applications. Their registration can be very accurate, and there is no delay between the motions of real and virtual scene elements. However, vision-based tracking often suffers from limited range, intermittent errors, and dropouts. These shortcomings are due to the need to see multiple calibrated features or fiducials in each frame. To address these shortcomings, features in the scene can be dynamically calibrated and pose calculations can be made robust to noise and numerical instability. In this paper, we survey classic vision-based pose computations and present two methods that offer increased robustness and accuracy in the context of real-time AR tracking.
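The classic vision-based pose computation the abstract refers to can be illustrated with a minimal sketch. The following is not the paper's method, only a textbook Direct Linear Transform (DLT): given 3D scene features and their noisy 2D image projections, it recovers a 3x4 camera projection matrix by linear least squares. All camera parameters and point values below are made up for the example.

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Estimate a 3x4 matrix P with x ~ P @ [X; 1] from >= 6 correspondences."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]
        # Each correspondence contributes two linear equations in P's entries.
        A.append([*Xh, 0.0, 0.0, 0.0, 0.0, *[-u * c for c in Xh]])
        A.append([0.0, 0.0, 0.0, 0.0, *Xh, *[-v * c for c in Xh]])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)

def reproject(P, X):
    """Project 3D points through P and return inhomogeneous pixel coordinates."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    xh = (P @ Xh.T).T
    return xh[:, :2] / xh[:, 2:3]

# Synthetic ground truth: an assumed intrinsic matrix and a camera 5 units
# in front of the fiducial cloud (identity rotation).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P_true = K @ np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(8, 3))        # 8 calibrated scene features
x = reproject(P_true, X)
x += rng.normal(scale=0.2, size=x.shape)       # simulated pixel noise

P_est = dlt_projection_matrix(X, x)
err = np.abs(reproject(P_est, X) - x).mean()
print(f"mean reprojection error: {err:.3f} px")
```

Because DLT minimizes an algebraic rather than geometric error, it is sensitive to noise and degenerate feature configurations, which is exactly the fragility that the robust formulations surveyed in the paper aim to reduce.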