Discovering areas of interest with geo-tagged images and check-ins

  • Authors:
  • Jiajun Liu, Zi Huang, Lei Chen, Heng Tao Shen, Zhixian Yan

  • Affiliations:
  • The University of Queensland, Brisbane, Australia (Jiajun Liu, Zi Huang, Heng Tao Shen); Hong Kong University of Science and Technology, Hong Kong (Lei Chen); Ecole Polytechnique Federale de Lausanne, Lausanne, Switzerland (Zhixian Yan)

  • Venue:
  • Proceedings of the 20th ACM international conference on Multimedia
  • Year:
  • 2012


Abstract

Geo-tagged images are an ideal source for discovering popular travel places. However, popular venues for daily-life purposes such as dining and shopping are often missing from the locations mined from geo-tagged images. Fortunately, check-in websites offer a unique opportunity to analyze people's daily-life preferences and thereby complement the knowledge mined from geo-tagged images. This paper presents a novel approach for the discovery of Areas of Interest (AoI). By analyzing both geo-tagged images and check-ins, the approach exploits travelers' tastes as well as local residents' daily-life activity preferences to find AoI in a city. The proposed approach consists of two major steps. First, we devise a density-based clustering method to discover AoI, based primarily on image densities but reinforced by secondary densities derived from the images' neighboring venues. Second, we propose a novel joint authority analysis framework to rank AoI, which simultaneously considers location-location transitions and user-location relations. An interactive interface for visualizing AoI is also presented. The approach is tested on large datasets for the city of Shanghai, consisting of 49,460 geo-tagged images from Panoramio.com and 1,361,547 check-ins from the check-in website Qieke.com. By evaluating the ranking accuracy and the quality of the discovered AoI, we demonstrate substantial improvements of our method over the compared methods.
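The abstract outlines two technical steps: density-based clustering of geo-tagged images reinforced by the check-in density of neighboring venues, and a joint authority analysis that combines location-location transitions with user-location relations. The paper's own algorithms are not reproduced here; the Python below is only a minimal, illustrative sketch of how such a pipeline could be wired together. The function names, parameters (eps, min_samples, boost_radius, alpha), input formats, and the use of scikit-learn's DBSCAN with sample weights plus a HITS-style iteration are all assumptions standing in for the authors' density-based clustering and joint authority framework.

```python
import numpy as np
from sklearn.cluster import DBSCAN


def discover_aoi(photos, venues, checkins, eps=0.002, min_samples=10, boost_radius=0.002):
    """Cluster geo-tagged images into candidate AoI (illustrative sketch).

    photos:   (N, 2) array of [lat, lon] image coordinates
    venues:   (M, 2) array of [lat, lon] venue coordinates
    checkins: (M,)   array of check-in counts per venue

    Each image is weighted by the check-in mass of venues within
    boost_radius, so clusters can also form where images are sparse but
    nearby venues are popular -- a crude stand-in for reinforcing image
    density with a secondary density from neighboring venues. Distances
    treat lat/lon degrees as planar coordinates for simplicity.
    """
    weights = np.ones(len(photos))
    for i, p in enumerate(photos):
        dist = np.linalg.norm(venues - p, axis=1)
        weights[i] += checkins[dist < boost_radius].sum() / 100.0  # illustrative scaling
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(photos, sample_weight=weights).labels_
    return labels  # -1 marks noise; other labels index candidate AoI


def rank_aoi(transitions, visits, n_iter=50, alpha=0.5):
    """HITS-style joint authority sketch for ranking AoI.

    transitions: (K, K) matrix, transitions[i, j] = trips observed from AoI i to AoI j
    visits:      (U, K) matrix, visits[u, k] = times user u visited AoI k

    Each iteration mixes authority propagated along location-location
    transitions with authority propagated through user-location visits.
    """
    k = transitions.shape[0]
    auth = np.ones(k) / k
    for _ in range(n_iter):
        loc_score = transitions.T @ auth   # support from incoming transitions
        user_hub = visits @ auth           # users scored by the AoI they visit
        usr_score = visits.T @ user_hub    # ...and pushed back onto the AoI
        auth = alpha * loc_score + (1 - alpha) * usr_score
        auth /= np.abs(auth).sum()         # L1-normalize to keep scores bounded
    return auth
```

Under these assumptions, one would first run discover_aoi on the image and venue coordinates, aggregate trips and visits per resulting cluster, and then feed those matrices to rank_aoi to obtain a ranked list of areas.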