International Journal of Computer Vision
A Flexible New Technique for Camera Calibration
IEEE Transactions on Pattern Analysis and Machine Intelligence
A Hybrid Registration Method for Outdoor Augmented Reality
ISAR '01 Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR'01)
First Steps Towards Handheld Augmented Reality
ISWC '03 Proceedings of the 7th IEEE International Symposium on Wearable Computers
Vision and Inertial Sensor Cooperation Using Gravity as a Vertical Reference
IEEE Transactions on Pattern Analysis and Machine Intelligence
Lucas-Kanade 20 Years On: A Unifying Framework
International Journal of Computer Vision
Distinctive Image Features from Scale-Invariant Keypoints
International Journal of Computer Vision
Sensor Fusion and Occlusion Refinement for Tablet-Based AR
ISMAR '04 Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality
Video See-Through AR on Consumer Cell-Phones
ISMAR '04 Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality
Matching with PROSAC – Progressive Sample Consensus
CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Volume 1
A Performance Evaluation of Local Descriptors
IEEE Transactions on Pattern Analysis and Machine Intelligence
A Hybrid and Linear Registration Method Utilizing Inclination Constraint
ISMAR '05 Proceedings of the 4th IEEE/ACM International Symposium on Mixed and Augmented Reality
A Comparison of Affine Region Detectors
International Journal of Computer Vision
Going out: robust model-based tracking for outdoor augmented reality
ISMAR '06 Proceedings of the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality
A Fast Initialization Method for Edge-based Registration Using an Inclination Constraint
ISMAR '07 Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality
Pose tracking from natural features on mobile phones
ISMAR '08 Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality
An iterative image registration technique with an application to stereo vision
IJCAI'81 Proceedings of the 7th International Joint Conference on Artificial Intelligence - Volume 2
A dataset and evaluation methodology for template-based tracking algorithms
ISMAR '09 Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality
Shape recognition and pose estimation for mobile augmented reality
ISMAR '09 Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality
Rolling and shooting: two augmented reality games
CHI '10 Extended Abstracts on Human Factors in Computing Systems
Learning Real-Time Perspective Patch Rectification
International Journal of Computer Vision
Multisensory embedded pose estimation
WACV '11 Proceedings of the 2011 IEEE Workshop on Applications of Computer Vision (WACV)
Evaluation of Interest Point Detectors and Feature Descriptors for Visual Tracking
International Journal of Computer Vision
Gravity-aware handheld Augmented Reality
ISMAR '11 Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality
Leveraging 3D City Models for Rotation Invariant Place-of-Interest Recognition
International Journal of Computer Vision
Inertial sensor-aligned visual feature descriptors
CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
Video-Based In Situ Tagging on Mobile Phones
IEEE Transactions on Circuits and Systems for Video Technology
Editorial: Foreword to special section on augmented reality
Computers and Graphics
This article is a revised version of an earlier work on Gravity-Aware Handheld Augmented Reality (AR) (Kurz and Benhimane, 2011 [1]), which investigates how different stages in handheld AR applications can benefit from knowing the direction of gravity as measured with inertial sensors. It presents approaches that incorporate the gravity vector to improve the description and matching of feature points, the detection and tracking of planar templates, and the visual quality of rendered virtual 3D objects.

In handheld AR, both the camera and the display are held in the user's hand and can therefore be moved freely. The camera pose is generally determined with respect to piecewise planar objects that have a static and known orientation relative to gravity. In the presence of (close to) vertical surfaces, we show how Gravity-Aligned Feature Descriptors (GAFDs) improve the initialization of tracking algorithms that rely on feature point descriptors, in terms of both quality and performance. For (close to) horizontal surfaces, we propose to use the gravity vector to rectify the camera image and to detect and describe features in the rectified image. The resulting Gravity-Rectified Feature Descriptors (GREFDs) provide an improved precision-recall characteristic and enable faster initialization, in particular under steep viewing angles. Gravity-rectified camera images also allow for real-time 6 DoF pose estimation using an edge-based object detection algorithm that handles only 4 DoF similarity transforms. Finally, the rendering of virtual 3D objects can be made more realistic and plausible by taking into account the direction of the gravitational force in addition to the relative pose between the handheld device and a real object.

In comparison to the original paper, this work provides a more elaborate evaluation of the presented algorithms.
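The core idea behind gravity-aligned descriptors is to replace the intensity-based dominant orientation of a feature with the direction of the projected gravity vector at that pixel. The sketch below illustrates this for a pinhole camera; the function name and the unit-depth back-projection are hypothetical choices for this example, not the authors' implementation.

```python
import numpy as np

def gafd_orientation(pt, gravity_cam, K):
    """Orientation a GAFD-style descriptor could assign to a feature at
    pixel `pt`: the image-plane direction of gravity projected at that
    point (illustrative sketch, assuming a pinhole camera model).

    pt          -- (u, v) pixel coordinates of the feature
    gravity_cam -- 3-vector: gravity direction in camera coordinates
    K           -- 3x3 camera intrinsic matrix
    """
    # Back-project the pixel to a 3D point on its viewing ray (depth 1).
    X = np.linalg.inv(K) @ np.array([pt[0], pt[1], 1.0])
    # Take a small step along the gravity direction in camera space.
    X2 = X + 0.01 * gravity_cam / np.linalg.norm(gravity_cam)
    # Project both points and take the image-plane displacement.
    h1, h2 = K @ X, K @ X2
    p1, p2 = h1[:2] / h1[2], h2[:2] / h2[2]
    d = p2 - p1
    # This angle replaces the intensity-based dominant orientation.
    return np.arctan2(d[1], d[0])
```

Because this orientation depends only on the gravity measurement and the camera geometry, two views of the same vertical surface assign consistent orientations even when the local image gradients are ambiguous or repetitive.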
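For horizontal surfaces, gravity rectification amounts to warping the image with the homography induced by a pure camera rotation that aligns the optical axis with gravity. A minimal sketch, assuming a pinhole model with known intrinsics K (the function name and the Rodrigues-based construction are choices made for this example):

```python
import numpy as np

def gravity_rectifying_homography(gravity_cam, K):
    """Homography that warps the image so a horizontal ground plane
    appears fronto-parallel (illustrative sketch). `gravity_cam` is the
    measured gravity direction in camera coordinates."""
    g = gravity_cam / np.linalg.norm(gravity_cam)
    z = np.array([0.0, 0.0, 1.0])  # target: gravity along the optical axis
    v = np.cross(g, z)
    s, c = np.linalg.norm(v), float(np.dot(g, z))
    if s < 1e-12:
        R = np.eye(3)  # already looking straight down (or up)
    else:
        vx = np.array([[0.0, -v[2], v[1]],
                       [v[2], 0.0, -v[0]],
                       [-v[1], v[0], 0.0]])
        # Rodrigues formula: rotation taking g onto the optical axis.
        R = np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)
    # Image warp induced by a pure rotation of the camera.
    return K @ R @ np.linalg.inv(K)
```

Detecting and describing features in the warped image yields GREFD-style descriptors; the same rectified view is what reduces the edge-based detection problem to a 4 DoF similarity transform.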
We propose a method that enables the evaluation of inertial-sensor-aided visual tracking methods without real inertial sensor data: by synthesizing gravity measurements from ground-truth camera poses, we benchmark our algorithms on a large existing dataset. Based on this approach, we also develop and evaluate a gravity-adaptive approach that performs image rectification only when it is beneficial.
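Synthesizing a gravity measurement from a ground-truth pose only requires rotating the world gravity direction into the camera frame; sensor error can be mimicked by perturbing the result. The sketch below is illustrative: the function name, the world-gravity-along-negative-Z convention, and the noise model are assumptions for this example, not the paper's exact procedure.

```python
import numpy as np

def synthesize_gravity(R_world_to_cam, noise_deg=0.0, rng=None):
    """Synthesize an inertial gravity measurement from a ground-truth
    camera rotation (illustrative sketch). Assumes world gravity points
    along -Z; `noise_deg` tilts the result to mimic sensor error."""
    g_world = np.array([0.0, 0.0, -1.0])
    g_cam = R_world_to_cam @ g_world  # rotate gravity into the camera frame
    if noise_deg > 0.0:
        if rng is None:
            rng = np.random.default_rng()
        # Random axis perpendicular to g_cam, so the perturbation angle
        # is exactly noise_deg.
        axis = rng.normal(size=3)
        axis -= np.dot(axis, g_cam) * g_cam
        axis /= np.linalg.norm(axis)
        ang = np.deg2rad(noise_deg)
        # Rodrigues rotation; the (1 - cos) term vanishes since axis ⟂ g_cam.
        g_cam = np.cos(ang) * g_cam + np.sin(ang) * np.cross(axis, g_cam)
    return g_cam / np.linalg.norm(g_cam)
```

Sweeping `noise_deg` over plausible sensor accuracies then shows how robust each gravity-aware stage is to imperfect inertial measurements, without requiring a dataset that records real sensor data.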