Real-time non-rigid shape recovery via active appearance models for augmented reality

  • Authors:
  • Jianke Zhu; Steven C. H. Hoi; Michael R. Lyu

  • Affiliations:
  • Department of Computer Science & Engineering, Chinese University of Hong Kong, Shatin, Hong Kong (all authors)

  • Venue:
  • ECCV '06: Proceedings of the 9th European Conference on Computer Vision, Part I
  • Year:
  • 2006

Abstract

One main challenge in Augmented Reality (AR) applications is to accurately track video objects as their movement, orientation, size, and position change, which makes recovering non-rigid shape and global pose in real time a difficult task. This paper proposes a novel two-stage scheme for online non-rigid shape recovery toward AR applications using Active Appearance Models (AAMs). First, we construct 3D shape models from AAMs offline, without requiring any 3D scan data. Based on the computed 3D shape models, we then propose an efficient online algorithm that estimates both 3D pose and non-rigid shape parameters via local bundle adjustment for building up point correspondences. Our approach recovers 3D non-rigid shape effectively, without manual intervention, from either real-time video sequences or a single image. The recovered 3D pose parameters can be used for AR registration. Furthermore, facial features can be tracked simultaneously, which is critical for many face-related applications. We evaluate our algorithms on several video sequences. Promising experimental results demonstrate that the proposed scheme is effective and well suited for real-time AR applications.
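
The online stage described above couples 3D pose estimation with linear non-rigid shape coefficients. The sketch below is only a rough illustration of that kind of coupling, not the authors' method (which uses local bundle adjustment over point correspondences): it assumes a weak-perspective camera and a precomputed linear 3D shape basis, and alternates a least-squares pose fit with a least-squares fit of the shape coefficients. All function names, shapes, and the alternation scheme are assumptions introduced here for illustration.

```python
import numpy as np

def fit_weak_perspective(points_2d, shape_3d):
    """Least-squares fit of points_2d ~ shape_3d @ M.T + t, then project M
    onto a scaled rotation (orthonormal rows) to get (scale, R2, t)."""
    n = shape_3d.shape[0]
    A = np.hstack([shape_3d, np.ones((n, 1))])            # (N, 4)
    sol, *_ = np.linalg.lstsq(A, points_2d, rcond=None)   # (4, 2)
    M, t = sol[:3].T, sol[3]                               # M: (2, 3), t: (2,)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return s.mean(), U @ Vt, t                             # scale, R2 (2x3), t

def fit_shape_coeffs(points_2d, scale, R2, t, mean_shape, basis):
    """Linear least squares for coefficients c in
    S(c) = mean_shape + sum_k c_k * basis[k], given a fixed pose."""
    K = basis.shape[0]
    cols = [(scale * basis[k] @ R2.T).reshape(-1) for k in range(K)]
    A = np.stack(cols, axis=1)                             # (2N, K)
    resid = (points_2d - t - scale * mean_shape @ R2.T).reshape(-1)
    c, *_ = np.linalg.lstsq(A, resid, rcond=None)
    return c

def recover_pose_and_shape(points_2d, mean_shape, basis, n_iters=10):
    """Alternate pose / shape estimation from 2D landmarks (e.g. AAM tracks).

    points_2d:  (N, 2) tracked landmarks in one frame
    mean_shape: (N, 3) mean 3D shape
    basis:      (K, N, 3) non-rigid deformation modes
    """
    c = np.zeros(basis.shape[0])
    for _ in range(n_iters):
        shape_3d = mean_shape + np.tensordot(c, basis, axes=1)
        scale, R2, t = fit_weak_perspective(points_2d, shape_3d)
        c = fit_shape_coeffs(points_2d, scale, R2, t, mean_shape, basis)
    return scale, R2, t, c
```

In this hypothetical setup, `scale`, `R2`, and `t` would play the role of the recovered pose used for AR registration, while `c` captures the per-frame non-rigid deformation; per-frame results could further serve as the initialization for a bundle-adjustment refinement across frames, in the spirit of the paper's online algorithm.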