Multisensory embedded pose estimation

  • Authors:
  • Eyrun Eyjolfsdottir; Matthew Turk

  • Affiliations:
  • Computer Science Department, UC Santa Barbara; Computer Science Department, UC Santa Barbara

  • Venue:
  • WACV '11: Proceedings of the 2011 IEEE Workshop on Applications of Computer Vision (WACV)
  • Year:
  • 2011

Abstract

We present a multisensory method for estimating the transformation of a mobile phone between two images taken with its camera. Pose estimation is a necessary step for applications such as 3D reconstruction and panorama construction, but detecting and matching robust features can be computationally expensive. In this paper we propose a method that combines a mobile phone's inertial sensors (accelerometers and gyroscopes) with its camera to provide fast and accurate pose estimation. We use the inertial-based pose to warp the two images into the same perspective frame. We then employ an adaptive FAST feature detector, and use image patches, normalized with respect to illumination, as feature descriptors. After warping, the images are approximately aligned with each other, so the search for matching keypoints becomes faster and, in certain cases, more reliable. Our results show that by incorporating the inertial sensors we can considerably speed up the process of detecting and matching keypoints between two images, which is the most time-consuming step of pose estimation.
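The pipeline the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes OpenCV and NumPy, a hypothetical calibrated intrinsics matrix K, and a gyroscope-derived rotation matrix R between the two views. The pure-rotation warp H = K R K⁻¹, the threshold-lowering loop standing in for the adaptive FAST detector, the zero-mean/unit-variance patch normalization, and the windowed SSD search are all simplified stand-ins for the paper's components.

```python
import cv2
import numpy as np

# Hypothetical intrinsics for illustration; in practice K comes from
# calibrating the phone's camera.
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

def warp_with_inertial_rotation(img, R, K):
    # For a (near-)pure rotation R between views, the induced homography
    # is H = K R K^-1; warping by H brings the second image into
    # approximately the same perspective frame as the first.
    H = K @ R @ np.linalg.inv(K)
    h, w = img.shape[:2]
    return cv2.warpPerspective(img, H, (w, h))

def adaptive_fast(gray, target=500, thresh=40, step=5, min_thresh=5):
    # Lower the FAST threshold until roughly `target` corners respond --
    # a simple stand-in for an adaptive FAST detector.
    while True:
        kps = cv2.FastFeatureDetector_create(threshold=thresh).detect(gray)
        if len(kps) >= target or thresh - step < min_thresh:
            return kps
        thresh -= step

def patch_descriptor(gray, kp, size=9):
    # Illumination-normalized patch: subtract the mean and divide by the
    # standard deviation so matching is invariant to gain and bias.
    x, y, r = int(kp.pt[0]), int(kp.pt[1]), size // 2
    patch = gray[y - r:y + r + 1, x - r:x + r + 1].astype(np.float32)
    if patch.shape != (size, size):
        return None            # keypoint too close to the border
    patch -= patch.mean()
    s = patch.std()
    return patch / s if s > 0 else patch

def match(gray1, gray2, kps1, kps2, radius=15.0):
    # Because the inertial warp pre-aligns the images, each keypoint only
    # needs to be compared against candidates within a small window.
    pairs = []
    for k1 in kps1:
        d1 = patch_descriptor(gray1, k1)
        if d1 is None:
            continue
        best, best_ssd = None, np.inf
        for k2 in kps2:
            if np.hypot(k2.pt[0] - k1.pt[0], k2.pt[1] - k1.pt[1]) > radius:
                continue
            d2 = patch_descriptor(gray2, k2)
            if d2 is None:
                continue
            ssd = float(np.sum((d1 - d2) ** 2))
            if ssd < best_ssd:
                best, best_ssd = k2, ssd
        if best is not None:
            pairs.append((k1, best))
    return pairs
```

The windowed search is where the reported speedup plausibly comes from: without the inertial pre-alignment, each keypoint would have to be compared against all candidates, or matched with more expensive rotation- and scale-invariant descriptors, whereas after warping a plain normalized patch and a small search radius suffice.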