Global annotation on georeferenced photographs

  • Authors:
  • Jim Kleban; Emily Moxley; Jiejun Xu; B. S. Manjunath

  • Affiliations:
  • Vision Research Lab, Santa Barbara, CA (all authors)

  • Venue:
  • Proceedings of the ACM International Conference on Image and Video Retrieval
  • Year:
  • 2009


Abstract

We present an efficient world-scale system for providing automatic annotation of collections of geo-referenced photos. As a user uploads a photograph, a place of origin is estimated from visual features, which the user can then refine. Once the correct location is provided, tags are suggested based on geographic and visual similarity to images retrieved from a large database of 1.2 million photographs crawled from Flickr. The system mines geographically relevant terms and ranks candidate suggestions by their posterior probability given the observed visual and geocoordinate features. A series of experiments analyzes the geocoordinate prediction accuracy and the precision-recall performance of tag suggestions using information retrieval techniques. The system is novel in that it fuses geographic and visual information to provide annotations for uploaded photographs taken anywhere in the world in a matter of seconds.
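To make the ranking idea concrete, the sketch below shows one plausible way to score candidate tags by an approximate posterior given geo and visual cues: similarity-weighted tag frequency over nearby database images. This is an illustration under stated assumptions, not the authors' implementation; the toy database, the Gaussian-like geographic weighting, and the cosine visual similarity are all choices made here for clarity.

```python
"""Minimal sketch (not the paper's system) of tag suggestion by an
approximate posterior P(tag | geo, visual), scored as similarity-weighted
tag frequency over a toy database of geo-referenced images."""
import math
from collections import defaultdict

# Toy database: each entry is (lat, lon, visual feature vector, tags).
DATABASE = [
    (48.8584, 2.2945, [0.9, 0.1, 0.3], ["eiffel", "paris", "tower"]),
    (48.8606, 2.3376, [0.8, 0.2, 0.4], ["louvre", "paris", "museum"]),
    (40.6892, -74.0445, [0.2, 0.9, 0.5], ["liberty", "newyork", "statue"]),
]

def geo_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def visual_similarity(f1, f2):
    """Cosine similarity between visual feature vectors."""
    dot = sum(a * b for a, b in zip(f1, f2))
    n1 = math.sqrt(sum(a * a for a in f1))
    n2 = math.sqrt(sum(b * b for b in f2))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def suggest_tags(lat, lon, features, geo_scale_km=50.0, top_k=5):
    """Score each tag by geo- and visually-weighted frequency over the
    database, a stand-in for ranking by posterior probability."""
    scores = defaultdict(float)
    for db_lat, db_lon, db_feat, tags in DATABASE:
        geo_w = math.exp(-geo_distance_km(lat, lon, db_lat, db_lon) / geo_scale_km)
        vis_w = visual_similarity(features, db_feat)
        for tag in tags:
            scores[tag] += geo_w * vis_w
    total = sum(scores.values()) or 1.0
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(tag, s / total) for tag, s in ranked[:top_k]]

if __name__ == "__main__":
    # Query photo taken near the Eiffel Tower with an Eiffel-like feature vector.
    print(suggest_tags(48.858, 2.295, [0.85, 0.15, 0.35]))
```

In a world-scale setting, the linear scan above would be replaced by indexed nearest-neighbor retrieval over the geo and visual features, with only the retrieved neighbors contributing to the tag scores.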