Security of random output perturbation for statistical databases

  • Authors: Daniel Z. Zanger
  • Affiliations: SRI International, Arlington, Virginia
  • Venue: PSD'12 Proceedings of the 2012 international conference on Privacy in Statistical Databases
  • Year: 2012


Abstract

We prove that, against a database query-response privacy mechanism employing output perturbation with i.i.d. additive random noise, an adversary who is allowed a sufficiently large number of queries can exactly determine all records in an n-record database with overwhelming probability of success, and we establish corresponding quantitative confidence bounds for the attack's success probability. These confidence bounds do not depend on the cardinality $|D|$ of the data domain $D \subset \mathbb{R}$, where the database $\mathcal{D}$ is a member of the set $D^n$, and they even admit some unbounded data domains $D$ of (countably) infinite cardinality. Within the context of differential privacy, we show that our results also imply a lower bound on the variance of the independent, Laplace-distributed noise that must be added to query responses if database privacy is to be preserved. Our results do not require the additive noise to be bounded by $o(\sqrt{n})$, as assumed in Dinur & Nissim (2003) and Dwork & Yekhanin (2008), which, on the other hand, do admit correlated noise.
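
The following minimal Python sketch is purely illustrative and is not the paper's attack: it only shows why i.i.d. additive noise is fragile under repeated querying. If an adversary may re-issue the same query many times and the mechanism adds fresh, independent Laplace noise to each response, averaging the responses concentrates on the true answer. The query value, Laplace scale, and query count below are hypothetical parameters chosen for the demonstration.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's construction):
# output perturbation adds fresh i.i.d. Laplace noise to each response,
# and the adversary simply averages repeated responses to the same query.

rng = np.random.default_rng(0)

true_answer = 42.0      # hypothetical exact query result over the database
noise_scale = 10.0      # Laplace scale parameter b (variance 2*b^2)
num_queries = 100_000   # number of repeated queries granted to the adversary

# Output perturbation: each response gets independent Laplace noise.
responses = true_answer + rng.laplace(loc=0.0, scale=noise_scale, size=num_queries)

# By the law of large numbers, the sample mean concentrates around the
# true answer, so the i.i.d. noise is averaged away.
estimate = responses.mean()
print(f"estimate = {estimate:.3f}, error = {abs(estimate - true_answer):.3f}")
```

With correlated noise, as permitted in the works cited above, this naive averaging no longer suppresses the perturbation, which is why the i.i.d. assumption is central to the result stated in the abstract.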