Local Differential Perturbations: Location Privacy under Approximate Knowledge Attackers

  • Authors: Rinku Dewri
  • Affiliation: University of Denver, Denver
  • Venue: IEEE Transactions on Mobile Computing
  • Year: 2013

Abstract

Location privacy research has received wide attention in the past few years owing to the growing popularity of location-based applications, and the accompanying skepticism about the collection of location information. A large section of this research is directed toward mechanisms based on location obfuscation enforced using cloaking regions. The primary motivation for this engagement comes from the relatively well-researched area of database privacy. Researchers in this sibling domain have indicated multiple times that any notion of privacy is incomplete without explicit statements on the capabilities of an adversary. As a result, we have started to see some efforts to categorize the various forms of background knowledge that an adversary may possess in the context of location privacy. Along these lines, we consider some preliminary forms of attacker knowledge, and explore the implications that each form of knowledge has on location privacy. Continuing on, we extend our insights to a form of adversarial knowledge related to the geographic uncertainty that the adversary has in correctly locating a user. We empirically demonstrate that the use of cloaking regions can adversely impact the preservation of privacy in the presence of such approximate location knowledge, and show how perturbation-based mechanisms can instead provide a well-balanced tradeoff between privacy and service accuracy.
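The abstract does not detail the paper's specific perturbation mechanism. As a hedged illustration of what a perturbation-based location-privacy mechanism can look like, the sketch below samples planar Laplace noise, a technique from the broader differential-privacy-for-location literature (the Gamma radial distribution and the `epsilon` parameter here are standard for that technique, not taken from this paper):

```python
import math
import random

def planar_laplace_noise(epsilon):
    """Sample a 2-D offset with density proportional to exp(-epsilon * ||z||).

    Larger epsilon means less noise (weaker perturbation); the expected
    displacement from the true point is 2 / epsilon.
    """
    # Direction is uniform on the circle.
    theta = random.uniform(0.0, 2.0 * math.pi)
    # The radial coordinate of the planar Laplace follows Gamma(shape=2, scale=1/epsilon).
    r = random.gammavariate(2.0, 1.0 / epsilon)
    return r * math.cos(theta), r * math.sin(theta)

def perturb(x, y, epsilon):
    """Report a noisy location instead of a cloaking region around (x, y)."""
    dx, dy = planar_laplace_noise(epsilon)
    return x + dx, y + dy
```

Unlike a cloaking region, which reveals a set guaranteed to contain the user, the perturbed point carries calibrated uncertainty in every direction, which is the kind of privacy/service-accuracy tradeoff the abstract alludes to.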