Outdoors augmented reality on mobile phone using loxel-based visual feature organization

  • Authors:
  • Gabriel Takacs (Stanford University, Stanford, CA, USA); Vijay Chandrasekhar (Stanford University, Stanford, CA, USA); Natasha Gelfand (Nokia Research Center, Palo Alto, CA, USA); Yingen Xiong (Nokia Research Center, Palo Alto, CA, USA); Wei-Chao Chen (Nokia Research Center, Palo Alto, CA, USA); Thanos Bismpigiannis (Stanford University, Stanford, CA, USA); Radek Grzeszczuk (Nokia Research Center, Palo Alto, CA, USA); Kari Pulli (Nokia Research Center, Palo Alto, CA, USA); Bernd Girod (Stanford University, Stanford, CA, USA)

  • Venue:
  • MIR '08 Proceedings of the 1st ACM international conference on Multimedia information retrieval
  • Year:
  • 2008

Abstract

We have built an outdoors augmented reality system for mobile phones that matches camera-phone images against a large database of location-tagged images using a robust image retrieval algorithm. We avoid network latency by implementing the algorithm on the phone, and deliver excellent performance by adapting a state-of-the-art image retrieval algorithm based on robust local descriptors. Matching is performed against a database of highly relevant features, which is continuously updated to reflect changes in the environment. We achieve fast updates and scalability by pruning irrelevant features based on proximity to the user. By compressing and incrementally updating the features stored on the phone, we make the system amenable to low-bandwidth wireless connections. We demonstrate system robustness on a dataset of location-tagged images and show a smart-phone implementation that achieves a high image matching rate while operating in near real-time.
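
The proximity-based pruning the abstract describes can be pictured as partitioning the map into location cells ("loxels") and keeping only features from cells near the user. The sketch below is a minimal illustration of that idea under simple assumptions (a flat 2-D grid in projected map coordinates); it is not the paper's implementation, and all names and parameters here (Feature, LOXEL_SIZE_M, NEIGHBORHOOD) are hypothetical.

```python
# Hypothetical sketch of loxel-based feature pruning, inferred from the
# abstract: features are grouped by location cell ("loxel"), and only
# cells near the user's position are retained for image matching.
from collections import defaultdict
from typing import NamedTuple

LOXEL_SIZE_M = 30.0   # assumed cell edge length in meters (illustrative)
NEIGHBORHOOD = 1      # assumed radius, in cells, around the user's loxel

class Feature(NamedTuple):
    x: float           # feature position in projected map coordinates
    y: float
    descriptor: bytes  # compressed local descriptor (placeholder)

def loxel_of(x: float, y: float) -> tuple[int, int]:
    """Map a position to its loxel (grid cell) index."""
    return (int(x // LOXEL_SIZE_M), int(y // LOXEL_SIZE_M))

def build_index(features: list[Feature]) -> dict[tuple[int, int], list[Feature]]:
    """Group features by loxel so lookups near the user are cheap."""
    index: dict[tuple[int, int], list[Feature]] = defaultdict(list)
    for f in features:
        index[loxel_of(f.x, f.y)].append(f)
    return index

def features_near(index: dict[tuple[int, int], list[Feature]],
                  user_x: float, user_y: float) -> list[Feature]:
    """Return features in the user's loxel and its immediate neighbors;
    everything outside this neighborhood is pruned from the match set."""
    cx, cy = loxel_of(user_x, user_y)
    nearby: list[Feature] = []
    for dx in range(-NEIGHBORHOOD, NEIGHBORHOOD + 1):
        for dy in range(-NEIGHBORHOOD, NEIGHBORHOOD + 1):
            nearby.extend(index.get((cx + dx, cy + dy), []))
    return nearby
```

Keeping the match set restricted to a small neighborhood of cells is what makes the on-phone database both small enough to store locally and cheap to update incrementally as the user moves.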