Real-Time Object Tracking for Augmented Reality Combining Graph Cuts and Optical Flow

  • Authors:
  • Jonathan Mooser; Suya You; Ulrich Neumann

  • Affiliations:
  • CGIT Lab, University of Southern California. e-mail: mooser@graphics.usc.edu; CGIT Lab, University of Southern California. e-mail: suyay@graphics.usc.edu; CGIT Lab, University of Southern California. e-mail: uneumann@graphics.usc.edu

  • Venue:
  • ISMAR '07 Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2007

Abstract

We present an efficient and accurate object tracking algorithm based on the concept of graph cut segmentation. The ability to track visible objects in real time provides an invaluable tool for the implementation of markerless Augmented Reality. Once an object has been detected, its location in future frames can be used to position virtual content, and thus annotate the environment. Unlike many object tracking algorithms, our approach does not rely on a preexisting 3D model or any other information about the object or its environment. It takes, as input, a set of pixels representing an object in an initial frame and uses a combination of optical flow and graph cut segmentation to determine the corresponding pixels in each future frame. Experiments show that our algorithm robustly tracks objects of disparate shapes and sizes over hundreds of frames, and can even handle difficult cases where an object contains many of the same colors as its background. We further show how this technology can be applied to practical AR applications.
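
The abstract does not give implementation details, so the following is only a minimal sketch of the general idea of combining optical flow with graph-cut segmentation, not the authors' method. It assumes OpenCV, using pyramidal Lucas-Kanade (calcOpticalFlowPyrLK) to propagate points from the previous object region and GrabCut (a graph-cut based segmenter) to recover a full object mask in the new frame. The function name track_object and all parameter values are illustrative.

```python
# Hypothetical sketch (not the paper's implementation): propagate an object mask
# frame-to-frame with optical flow, then refine it with graph-cut segmentation.
import cv2
import numpy as np

def track_object(prev_gray, curr_gray, curr_bgr, prev_mask):
    """Propagate a binary object mask from the previous frame to the current one."""
    # 1. Sample trackable feature points inside the previous object mask.
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01,
                                     minDistance=5, mask=prev_mask)
    if points is None:
        return prev_mask  # nothing to track; keep the old mask

    # 2. Track those points into the current frame with pyramidal Lucas-Kanade.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, points, None)
    good = next_pts[status.ravel() == 1].reshape(-1, 2)
    if len(good) == 0:
        return prev_mask

    # 3. Seed a GrabCut mask: probable background everywhere, probable foreground
    #    around the tracked points (a crude stand-in for the paper's graph-cut energy).
    gc_mask = np.full(curr_gray.shape, cv2.GC_PR_BGD, dtype=np.uint8)
    for x, y in good.astype(int):
        cv2.circle(gc_mask, (int(x), int(y)), 10, cv2.GC_PR_FGD, -1)

    # 4. Run graph-cut segmentation to recover a full-resolution object mask.
    bgd_model = np.zeros((1, 65), np.float64)
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(curr_bgr, gc_mask, None, bgd_model, fgd_model, 3, cv2.GC_INIT_WITH_MASK)

    new_mask = np.where((gc_mask == cv2.GC_FGD) | (gc_mask == cv2.GC_PR_FGD),
                        255, 0).astype(np.uint8)
    return new_mask
```

In this sketch the optical flow step supplies cheap frame-to-frame correspondence while the graph-cut step restores a precise per-pixel boundary; the paper's appeal is that the combination stays accurate in real time without a 3D model of the tracked object.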