Maximizing Privacy under Data Distortion Constraints in Noise Perturbation Methods

  • Authors:
  • Yaron Rachlin; Katharina Probst; Rayid Ghani

  • Affiliations:
  • Accenture Technology Labs, Chicago, USA; Google Inc., Atlanta, USA; Accenture Technology Labs, Chicago, USA

  • Venue:
  • Privacy, Security, and Trust in KDD
  • Year:
  • 2009


Abstract

This paper introduces "guessing anonymity," a definition of privacy for noise perturbation methods. This definition captures the difficulty of linking identity to a sanitized record using publicly available information. Importantly, it leads to analytical expressions that bound data privacy as a function of the noise perturbation parameters. Using these bounds, we can formulate optimization problems that describe the feasible tradeoffs between data distortion and privacy without exhaustively searching the noise parameter space. This work addresses an important shortcoming of noise perturbation methods by providing an intuitive definition of privacy, analogous to the one used in k-anonymity, and an analytical means for selecting noise parameters to achieve a desired level of privacy. At the same time, our work retains the appealing aspects of noise perturbation methods, which have made them popular both in practice and as a subject of academic research.
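To make the linkage-attack setting concrete, the following is a minimal illustrative sketch (not the paper's formal definition or method): numeric quasi-identifiers are perturbed with Gaussian noise controlled by a single parameter `sigma`, and an adversary who sees a sanitized record guesses candidate identities in order of distance to the public records. The rank at which the true record is guessed serves as a simple rank-based anonymity score in the spirit of guessing anonymity. All names, values, and the distance-based attack model here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical public quasi-identifiers (e.g., age, weight) for 5 individuals.
public = np.array([[25.0, 70.0],
                   [32.0, 80.0],
                   [47.0, 65.0],
                   [51.0, 90.0],
                   [60.0, 75.0]])

sigma = 5.0  # noise standard deviation: the perturbation parameter being tuned
sanitized = public + rng.normal(0.0, sigma, public.shape)

def guess_rank(sanitized_record, public_table, true_index):
    """Rank at which a distance-based adversary, guessing public records
    in order of increasing Euclidean distance to the sanitized record,
    links the true identity (1 = linked on the first guess)."""
    dists = np.linalg.norm(public_table - sanitized_record, axis=1)
    order = np.argsort(dists)
    return int(np.where(order == true_index)[0][0]) + 1

ranks = [guess_rank(sanitized[i], public, i) for i in range(len(public))]
print(ranks)
```

Larger `sigma` distorts the data more but tends to push these ranks higher, which is the distortion-versus-privacy tradeoff the paper's bounds characterize analytically rather than by such exhaustive simulation.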