Formal Verification of Differential Privacy for Interactive Systems (Extended Abstract)

  • Authors:
  • Michael Carl Tschantz; Dilsun Kaynar; Anupam Datta

  • Affiliations:
  • Carnegie Mellon University, Pittsburgh, USA (all authors)

  • Venue:
  • Electronic Notes in Theoretical Computer Science (ENTCS)
  • Year:
  • 2011

Abstract

Differential privacy is a promising approach to privacy-preserving data analysis with a well-developed theory for functions. Despite recent work on implementing systems that aim to provide differential privacy, the problem of formally verifying that these systems have differential privacy has not been adequately addressed. We develop a formal probabilistic automaton model of differential privacy for systems by adapting prior work on differential privacy for functions. We present the first sound verification technique for proving differential privacy of interactive systems. The technique is based on a form of probabilistic bisimulation relation. The novelty lies in the way we track quantitative privacy leakage bounds using a relation family instead of a single relation. We illustrate the proof technique on a representative automaton motivated by PINQ, an implemented system that is intended to provide differential privacy. Surprisingly, our analysis yields a privacy leakage bound of 2t·ε rather than t·ε when ε-differentially private functions are called t times. The extra leakage arises from accounting for the bounded memory constraints of real computers.
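
For context, the following is a minimal sketch of the standard function-level notion the abstract refers to (the well-known definition of ε-differential privacy and its sequential composition bound), not the paper's automaton model; the final inequality simply restates the 2t·ε bound quoted in the abstract.

```latex
% Standard \epsilon-differential privacy for a randomized function K,
% where B_1 and B_2 are databases differing in at most one record:
\[
  \Pr[K(B_1) \in S] \;\le\; e^{\epsilon}\,\Pr[K(B_2) \in S]
  \quad \text{for every set of outputs } S .
\]
% Sequential composition for functions: t adaptive calls to
% \epsilon-differentially private functions leak at most
\[
  \epsilon_{\mathrm{total}} \;\le\; t\,\epsilon ,
\]
% whereas the paper's system-level analysis, which accounts for the
% bounded memory of real implementations, yields the weaker bound
\[
  \epsilon_{\mathrm{total}} \;\le\; 2\,t\,\epsilon .
\]
```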