iScope: personalized multi-modality image search for mobile devices

  • Authors:
  • Changyun Zhu (Queen's University, Kingston, ON, Canada); Kun Li (University of Colorado at Boulder, Boulder, CO, USA); Qin Lv (University of Colorado at Boulder, Boulder, CO, USA); Li Shang (University of Colorado at Boulder, Boulder, CO, USA); Robert P. Dick (University of Michigan, Ann Arbor, MI, USA)

  • Venue:
  • Proceedings of the 7th international conference on Mobile systems, applications, and services
  • Year:
  • 2009

Abstract

Mobile devices are becoming a primary medium for personal information gathering, management, and sharing. Managing personal image data on mobile platforms is a difficult problem due to large data set size, content diversity, heterogeneous individual usage patterns, and resource constraints. This paper presents a user-centric system, called iScope, for personal image management and sharing on mobile devices. iScope uses multi-modality clustering of both content and context information for efficient image management and search, and online learning techniques for predicting images of interest. It also supports distributed content-based search among networked devices while maintaining the same intuitive interface, enabling efficient information sharing among people. We have implemented iScope and conducted in-field experiments using networked Nokia N810 portable Internet tablets. Energy efficiency was a primary focus throughout the design and implementation of the iScope search algorithms. Experimental results indicate that, relative to browsing, iScope reduces search time and search energy by 4.1X and 3.8X on average.
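The abstract's notion of multi-modality clustering can be sketched as grouping images by a distance that blends content features (e.g., a color descriptor) with context features (e.g., capture time or location). The following is a minimal illustrative sketch, not the paper's actual algorithm: the feature layout, the weighted distance, and the plain k-means loop are all assumptions made for demonstration.

```python
# Illustrative sketch of multi-modality clustering: each image carries
# separate "content" and "context" feature vectors, and clustering runs
# over a weighted combination of per-modality distances. The weights and
# features are hypothetical, not taken from the iScope paper.
import math
import random

def combined_distance(a, b, w_content=0.5, w_context=0.5):
    """Weighted sum of Euclidean distances in each modality."""
    d_content = math.dist(a["content"], b["content"])
    d_context = math.dist(a["context"], b["context"])
    return w_content * d_content + w_context * d_context

def kmeans(images, k, iters=20, seed=0):
    """Simple k-means over the combined content+context feature space."""
    rng = random.Random(seed)
    centers = [dict(img) for img in rng.sample(images, k)]
    assign = [0] * len(images)
    for _ in range(iters):
        # Assignment step: nearest center under the combined distance.
        for i, img in enumerate(images):
            assign[i] = min(range(k),
                            key=lambda c: combined_distance(img, centers[c]))
        # Update step: per-modality mean of each cluster's members.
        for c in range(k):
            members = [img for i, img in enumerate(images) if assign[i] == c]
            if not members:
                continue  # keep an empty cluster's center unchanged
            for key in ("content", "context"):
                dim = len(members[0][key])
                centers[c][key] = [
                    sum(m[key][d] for m in members) / len(members)
                    for d in range(dim)
                ]
    return assign

# Usage: two images with similar color and context should cluster together.
photos = [
    {"content": [0.0, 0.0], "context": [0.0]},
    {"content": [0.1, 0.0], "context": [0.0]},
    {"content": [5.0, 5.0], "context": [10.0]},
    {"content": [5.1, 5.0], "context": [10.0]},
]
labels = kmeans(photos, k=2)
```

Weighting the two modalities separately (rather than concatenating raw features) lets context dominate or recede depending on the user's search pattern, which is the kind of per-user adaptation the online-learning component would tune.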