Differential privacy under continual observation

  • Authors:
  • Cynthia Dwork; Moni Naor; Toniann Pitassi; Guy N. Rothblum

  • Affiliations:
  • Microsoft Research, Mountain View, CA, USA; Weizmann Institute of Science, Rehovot, Israel; University of Toronto, Toronto, Canada; Princeton University, Princeton, NJ, USA

  • Venue:
  • Proceedings of the forty-second ACM symposium on Theory of computing
  • Year:
  • 2010

Abstract

Differential privacy is a recent notion of privacy tailored to privacy-preserving data analysis [11]. Up to this point, research on differentially private data analysis has focused on the setting of a trusted curator holding a large, static data set; thus every computation is a "one-shot" object: there is no point in computing something twice, since the result will be unchanged, up to any randomness introduced for privacy. However, many applications of data analysis involve repeated computations, either because the entire goal is one of monitoring, e.g., of traffic conditions, search trends, or incidence of influenza, or because the goal is some kind of adaptive optimization, e.g., placement of data to minimize access costs. In these cases, the algorithm must permit continual observation of the system's state. We therefore initiate a study of differential privacy under continual observation. We identify the problem of maintaining a counter in a privacy-preserving manner and show its wide applicability to many different problems.
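The continual counting problem the abstract identifies — releasing a running count of a bit stream while protecting each individual bit — can be made concrete with a small sketch. The version below implements the standard binary-tree (dyadic partial sum) mechanism for a stream of known length T, adding Laplace noise to each dyadic block; the class name, parameters, and error analysis in the comments are illustrative assumptions for this sketch, not the paper's exact construction or notation.

```python
import math
import random

def laplace(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

class BinaryMechanismCounter:
    """Differentially private counter over a bit stream of known length T.

    Each stream bit contributes to at most log2(T)+1 dyadic partial sums,
    so adding Laplace(log2(T)/epsilon) noise to each sum gives
    epsilon-differential privacy for the whole stream; the additive error
    at any time step is O(log^{1.5} T / epsilon) with high probability.
    (Sketch under standard assumptions; not the paper's exact construction.)
    """

    def __init__(self, T, epsilon):
        self.levels = int(math.log2(T)) + 1
        self.scale = self.levels / epsilon
        self.alpha = [0] * self.levels    # exact dyadic sums in progress
        self.noisy = [0.0] * self.levels  # released noisy dyadic sums
        self.t = 0

    def update(self, bit):
        """Consume the next stream bit; return the current noisy count."""
        self.t += 1
        t = self.t
        # Level i of the lowest set bit of t: a dyadic block closes here.
        i = (t & -t).bit_length() - 1
        # Fold all lower-level partial sums into level i, then reset them.
        self.alpha[i] = bit + sum(self.alpha[:i])
        for j in range(i):
            self.alpha[j] = 0
            self.noisy[j] = 0.0
        self.noisy[i] = self.alpha[i] + laplace(self.scale)
        # The estimate sums the noisy blocks in the dyadic
        # decomposition of [1, t]: one block per set bit of t.
        return sum(self.noisy[j]
                   for j in range(self.levels) if (t >> j) & 1)
```

A stream bit influences only the O(log T) dyadic sums containing it, which is why per-step noise can be far smaller than the naive approach of re-releasing an independently noised total at every step.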