A real-time tracker for markerless augmented reality

  • Authors:
  • Andrew I. Comport, Éric Marchand, François Chaumette

  • Venue:
  • ISMAR '03 Proceedings of the 2nd IEEE/ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2003

Abstract

Augmented Reality has now progressed to the point where real-time applications are being considered and needed. At the same time it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. In order to address these issues, a real-time, robust and efficient 3D model-based tracking algorithm is proposed for a 'video see through' monocular vision system. Tracking the objects in the scene amounts to computing the pose between the camera and the objects; virtual objects can then be projected into the scene using this pose. Here, non-linear pose computation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different features including lines, circles, cylinders and spheres. A local moving edges tracker is used in order to provide real-time tracking of points normal to the object contours. A method is proposed for combining local position uncertainty and global pose uncertainty in an efficient and accurate way by propagating uncertainty. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively re-weighted least squares implementation. The method presented in this paper has been validated on several complex image sequences including outdoor environments. Results show the method to be robust to occlusion, changes in illumination and mis-tracking.
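
The abstract outlines the pose-estimation scheme: feature errors drive a virtual visual servoing control law, with an M-estimator folded in through iteratively re-weighted least squares so that outlying edge measurements are downweighted. The Python sketch below illustrates only that general scheme and is not the authors' implementation; the Tukey biweight, the gain value and the function names (tukey_weights, irls_pose_update) are assumptions chosen for illustration.

```python
import numpy as np

def tukey_weights(residuals, c=4.6851):
    """Tukey biweight: weight (1 - (r/c*s)^2)^2 for |r| < c*s, else 0.
    Residuals are scaled by a robust spread estimate (the MAD)."""
    scale = 1.4826 * np.median(np.abs(residuals - np.median(residuals))) + 1e-12
    u = residuals / (c * scale)
    w = (1.0 - u ** 2) ** 2
    w[np.abs(u) >= 1.0] = 0.0          # gross outliers get zero weight
    return w

def irls_pose_update(s, s_star, L, gain=0.5):
    """One robust, virtual-visual-servoing-style velocity update.

    s      : projected feature values at the current pose, shape (n,)
    s_star : observed feature values in the image, shape (n,)
    L      : interaction matrix ds/dv, shape (n, 6)
    Returns a 6-vector camera velocity screw (translation, rotation).
    """
    e = s - s_star                      # feature error
    w = tukey_weights(e)                # robust per-feature weights
    D = np.diag(w)
    # weighted least-squares step: v = -lambda * (D L)^+ D e
    return -gain * np.linalg.pinv(D @ L) @ (D @ e)

if __name__ == "__main__":
    # synthetic check: one gross outlier should barely affect the update
    rng = np.random.default_rng(0)
    L = rng.normal(size=(20, 6))        # stand-in interaction matrix
    s_star = rng.normal(size=20)        # "observed" features
    s = s_star + 0.01 * rng.normal(size=20)
    s[3] += 5.0                         # simulated mis-tracked edge point
    print(irls_pose_update(s, s_star, L))
```

In the paper's setting the rows of L would come from the point-to-curve distance features (lines, circles, cylinders, spheres) and the residuals from the moving edges tracker; iterating this weighted update drives the virtual camera to the pose that registers the model with the image.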