Inertial-aided KLT feature tracking for a moving camera

  • Authors:
  • Myung Hwangbo, Jun-Sik Kim, Takeo Kanade

  • Affiliation:
  • Robotics Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh

  • Venue:
  • IROS '09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009

Abstract

We propose a novel inertial-aided KLT feature tracking method that is robust to camera ego-motion. The conventional KLT relies on images alone, so it inherently requires the appearance change between consecutive images to be small. When camera ego-motion induces large optical flow, an inertial sensor attached to the camera can provide a good prediction that preserves tracking performance. We use a low-grade MEMS gyroscope to refine the initial condition of the nonlinear optimization in the KLT, which increases the likelihood that the warping parameters fall within the KLT's convergence region. For longer tracking with less drift, we use an affine photometric model, which effectively handles camera rolling and outdoor illumination change. The extra computational cost of this higher-order motion model is alleviated by restraining the Hessian update and by GPU acceleration. Experimental results are provided for both indoor and outdoor scenes, and GPU implementation issues are discussed.
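To make the core idea concrete, the following is a minimal sketch (not the authors' implementation, which uses an affine photometric model, a restrained Hessian update, and GPU acceleration) of how a gyroscope measurement can seed the KLT initial condition. It assumes a pure-rotation prediction via the homography H = K R K^-1 and substitutes OpenCV's pyramidal Lucas-Kanade tracker for the authors' tracker; the names `omega`, `dt`, and `K` are illustrative.

```python
import numpy as np
import cv2


def gyro_predicted_klt(prev_img, next_img, prev_pts, omega, dt, K):
    """Hypothetical sketch: seed KLT with a gyro-based motion prediction.

    prev_pts : (N, 2) float32 feature locations in the previous frame
    omega    : (3,) angular rate from the gyroscope [rad/s] (assumed in camera frame)
    dt       : time between the two frames [s]
    K        : 3x3 camera intrinsic matrix
    """
    # Integrate the gyro over the frame interval to get the inter-frame rotation.
    R, _ = cv2.Rodrigues(omega * dt)

    # A pure camera rotation induces the image homography H = K R K^-1.
    H = K @ R @ np.linalg.inv(K)

    # Warp the previous feature locations to predict where they should appear.
    pts = prev_pts.reshape(-1, 1, 2).astype(np.float32)
    pred_pts = cv2.perspectiveTransform(pts, H).astype(np.float32)

    # Run pyramidal KLT, passing the prediction as the initial guess so the
    # nonlinear optimization starts inside its convergence region even when
    # fast ego-motion produces large optical flow.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_img, next_img, pts, pred_pts.copy(),
        winSize=(21, 21), maxLevel=3,
        flags=cv2.OPTFLOW_USE_INITIAL_FLOW)
    return next_pts, status
```

In this sketch the gyro prediction only replaces the identity initialization of the warp; the image-based optimization still refines each feature, so gyro bias and the neglected translation component are absorbed as long as the prediction lands within the tracker's convergence basin.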