MediaScope: selective on-demand media retrieval from mobile devices

  • Authors and affiliations:
  • Yurong Jiang (University of Southern California, Los Angeles, CA, USA); Xing Xu (University of Southern California, Los Angeles, CA, USA); Peter Terlecky (City University of New York, New York, NY, USA); Tarek Abdelzaher (University of Illinois at Urbana-Champaign, Urbana, IL, USA); Amotz Bar-Noy (City University of New York, New York, NY, USA); Ramesh Govindan (University of Southern California, Los Angeles, CA, USA)

  • Venue:
  • Proceedings of the 12th International Conference on Information Processing in Sensor Networks (IPSN '13)
  • Year:
  • 2013

Abstract

Motivated by an availability gap for visual media, where images and videos are uploaded from mobile devices well after they are generated, we explore the selective, timely retrieval of media content from a collection of mobile devices. We envision this capability being driven by similarity-based queries posed to a cloud search front-end, which in turn dynamically retrieves media objects from mobile devices that best match the respective queries within a given time limit. Building upon a crowd-sensing framework, we have designed and implemented a system called MediaScope that provides this capability. MediaScope is an extensible framework that supports nearest-neighbor and other geometric queries on the feature space (e.g. clusters, spanners), and contains novel retrieval algorithms that attempt to maximize the retrieval of relevant information. From experiments on a prototype, MediaScope is shown to achieve near-optimal query completeness and low to moderate overhead on mobile devices.
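The abstract describes similarity-based (e.g. nearest-neighbor) queries over a feature space of media objects. As an illustration only, the following minimal Python sketch shows what a k-nearest-neighbor query over per-object feature vectors might look like; the object identifiers, feature dimensions, and `knn_query` helper are hypothetical and not taken from the MediaScope system itself.

```python
import math

def knn_query(query_vec, features, k):
    """Return the ids of the k objects whose feature vectors are
    closest to query_vec under Euclidean distance (hypothetical helper)."""
    scored = sorted(features.items(),
                    key=lambda kv: math.dist(query_vec, kv[1]))
    return [obj_id for obj_id, _ in scored[:k]]

# Toy feature index: object id -> 2-D feature vector (made-up data).
index = {
    "img_a": (0.1, 0.2),
    "img_b": (0.9, 0.8),
    "img_c": (0.15, 0.25),
}

# The two objects most similar to the query point (0, 0).
print(knn_query((0.0, 0.0), index, 2))
```

In the system described, such queries would be posed to the cloud front-end, which must then decide which matching objects to pull from which devices before the deadline; this sketch covers only the similarity-ranking step, not the timeliness-aware retrieval scheduling.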