Three-dimensional motion and structure estimation using inertial sensors and computer vision for augmented reality

  • Authors:
  • Lin Chai; William A. Hoff; Tyrone Vincent

  • Affiliations:
  • Xilinx, Inc., 2100 Logic Drive, San Jose, CA; Engineering Division, Colorado School of Mines, Golden, CO; Engineering Division, Colorado School of Mines, Golden, CO

  • Venue:
  • Presence: Teleoperators and Virtual Environments
  • Year:
  • 2002

Abstract

A new method for registration in augmented reality (AR) was developed that simultaneously tracks the position, orientation, and motion of the user's head and estimates the three-dimensional (3D) structure of the scene. The method fuses data from head-mounted cameras and head-mounted inertial sensors. Two extended Kalman filters (EKFs) are used: one estimates the motion of the user's head, and the other estimates the 3D locations of points in the scene; the two filters are coupled in a recursive loop, each using the other's latest estimate. The algorithm was tested on a combination of synthetic and real data and was generally found to perform well. A further test showed that a two-camera system performed substantially better than a single-camera system, although improving the accuracy of the inertial sensors can partially compensate for the loss of one camera. The method is suitable for use in completely unstructured and unprepared environments. Unlike previous work in this area, it requires no a priori knowledge about the scene and can operate in environments in which the objects of interest are close to the user.
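
To make the two-filter structure concrete, the following is a minimal 1-D sketch of coupled motion and structure EKFs. Everything in it is illustrative and assumed: the toy state layouts, the range-style measurement model, and all names (`EKF`, `motion_ekf`, `structure_ekf`, `step`) are hypothetical, not the authors' formulation, which operates on full 6-DOF head pose and image-plane feature measurements.

```python
# Hypothetical 1-D sketch of two coupled EKFs: one for head motion (driven by an
# inertial reading), one for the location of a static scene point (driven by a
# camera-like relative measurement). Not the paper's actual models.
import numpy as np

class EKF:
    """Generic extended Kalman filter: state x with covariance P."""
    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)
        self.P = np.asarray(P0, dtype=float)

    def predict(self, f, F, Q):
        """Propagate the state with transition f, Jacobian F, process noise Q."""
        self.x = f(self.x)
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        """Correct the state with measurement z, model h, Jacobian H, noise R."""
        y = z - h(self.x)                       # innovation
        S = H @ self.P @ H.T + R                # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

dt = 0.01

# Motion filter: head position and velocity along one axis.
motion_ekf = EKF(x0=[0.0, 0.0], P0=np.eye(2))
# Structure filter: position of a single static scene point on the same axis.
structure_ekf = EKF(x0=[1.0], P0=np.eye(1))

def step(accel_meas, range_meas):
    """One cycle of the recursive loop: inertial predict, then cross-coupled updates."""
    # 1) Predict head motion from the inertial (accelerometer) reading.
    f = lambda x: np.array([x[0] + x[1] * dt + 0.5 * accel_meas * dt**2,
                            x[1] + accel_meas * dt])
    F = np.array([[1.0, dt], [0.0, 1.0]])
    motion_ekf.predict(f, F, Q=1e-4 * np.eye(2))

    # 2) Update head motion from the camera-like measurement (point minus head),
    #    treating the structure filter's current point estimate as known.
    m = structure_ekf.x[0]
    motion_ekf.update(z=np.array([range_meas]),
                      h=lambda x: np.array([m - x[0]]),
                      H=np.array([[-1.0, 0.0]]),
                      R=np.array([[1e-2]]))

    # 3) Update the scene point from the same measurement, treating the
    #    motion filter's current head position as known.
    p = motion_ekf.x[0]
    structure_ekf.update(z=np.array([range_meas]),
                         h=lambda x: np.array([x[0] - p]),
                         H=np.array([[1.0]]),
                         R=np.array([[1e-2]]))

# Example run: head at rest, true point 1.2 units ahead, noisy measurements.
for _ in range(100):
    step(accel_meas=0.0, range_meas=1.2 + 0.05 * np.random.randn())
print(motion_ekf.x, structure_ekf.x)
```

Splitting motion and structure into separate filters keeps each state vector small, at the cost of treating the other filter's current estimate as exact during each update; the recursive loop described in the abstract alternates the two updates so each filter benefits from the other's latest result.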