An EyeTap video-based featureless projective motion estimation assisted by gyroscopic tracking for wearable computer mediated reality

  • Authors:
  • Chris Aimone; James Fung; Steve Mann

  • Affiliation:
  • University of Toronto, 10 King's College Road, Toronto, Canada

  • Venue:
  • Personal and Ubiquitous Computing
  • Year:
  • 2003

Abstract

In this paper we present a computationally economical method of recovering the projective motion of head-mounted cameras or EyeTap devices for use in wearable computer-mediated reality. The tracking system combines featureless vision and inertial methods in a closed-loop system to achieve accurate, robust head tracking with inexpensive sensors. The combination of inertial and vision techniques provides the high-accuracy visual registration needed to fit computer graphics onto real images, together with robustness to the large interframe camera motion caused by fast head rotations. Operating on a 1.2 GHz Pentium III wearable computer with hardware-accelerated graphics, the system registers live video images with less than 2 pixels of error (0.3 degrees) at 12 frames per second. Fast image registration is achieved by offloading computer vision computation onto the graphics hardware, which is readily available on many wearable computer systems. As an application of this tracking approach, we present a system that allows wearable computer users to share views of their current environments, stabilised to another viewer's head position.
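To make the closed-loop idea concrete, the sketch below (not the authors' implementation) illustrates one common way to combine a gyroscope with featureless projective registration: integrated gyro rates predict the interframe homography of a rotating camera via H = K R K⁻¹, and a direct, intensity-based alignment refines that prediction. OpenCV's ECC maximisation stands in for the paper's GPU-accelerated featureless estimator; the intrinsic matrix K, the frame interval, and all function names here are illustrative assumptions.

```python
# Hedged sketch of gyro-assisted featureless projective registration.
# Assumptions: a pinhole camera with intrinsics K, gyro rates in rad/s,
# grayscale uint8 frames, and OpenCV's ECC as the direct alignment step.
import cv2
import numpy as np

K = np.array([[500.0,   0.0, 320.0],   # assumed focal length and centre
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def predict_homography(omega, dt, K=K):
    """Map integrated gyro rates to a predicted interframe warp.

    For a purely rotating camera (a reasonable model for head motion),
    the interframe homography is K R K^-1. This prediction is what keeps
    tracking robust to large interframe motion from fast head rotations.
    """
    rot_vec = np.asarray(omega, dtype=np.float64).reshape(3, 1) * dt
    R, _ = cv2.Rodrigues(rot_vec)
    H = K @ R @ np.linalg.inv(K)
    return (H / H[2, 2]).astype(np.float32)

def refine_featureless(prev_gray, curr_gray, H_pred):
    """Refine the gyro-predicted warp by direct intensity alignment.

    ECC maximisation is 'featureless' in the abstract's sense: it works
    on raw pixel intensities rather than extracted feature points.
    """
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    try:
        _, H = cv2.findTransformECC(prev_gray, curr_gray, H_pred.copy(),
                                    cv2.MOTION_HOMOGRAPHY, criteria, None, 5)
        return H
    except cv2.error:
        return H_pred  # vision failed to converge; fall back on the gyro

# Per-frame use (omega from the gyro, frames from the head-mounted camera):
# H = refine_featureless(prev, curr, predict_homography(omega, 1.0 / 12.0))
# registered = cv2.warpPerspective(overlay, H, (width, height))
```

In a full closed loop, the difference between the predicted and refined warps would also be fed back to estimate gyro bias, which is what keeps an inexpensive inertial sensor from drifting over time.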