Revealing information while preserving privacy. Proceedings of the Twenty-Second ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems.
An information statistics approach to data stream and communication complexity. Journal of Computer and System Sciences, special issue on FOCS 2002.
The price of privacy and the limits of LP decoding. Proceedings of the Thirty-Ninth Annual ACM Symposium on Theory of Computing.
A learning theory approach to non-interactive database privacy. STOC '08: Proceedings of the Fortieth Annual ACM Symposium on Theory of Computing.
How to compress interactive communication. Proceedings of the Forty-Second ACM Symposium on Theory of Computing.
On the geometry of differential privacy. Proceedings of the Forty-Second ACM Symposium on Theory of Computing.
The Limits of Two-Party Differential Privacy. FOCS '10: Proceedings of the 2010 IEEE 51st Annual Symposium on Foundations of Computer Science.
Our data, ourselves: privacy via distributed noise generation. EUROCRYPT '06: Proceedings of the 24th Annual International Conference on the Theory and Applications of Cryptographic Techniques.
Calibrating noise to sensitivity in private data analysis. TCC '06: Proceedings of the Third Conference on Theory of Cryptography.
Is privacy compatible with truthfulness? Proceedings of the 4th Conference on Innovations in Theoretical Computer Science.
Non-interactive differential privacy: a survey. Proceedings of the First International Workshop on Open Data.
The geometry of differential privacy: the sparse and approximate cases. Proceedings of the Forty-Fifth Annual ACM Symposium on Theory of Computing.
Faster private release of marginals on small databases. Proceedings of the 5th Conference on Innovations in Theoretical Computer Science.
Mechanism design in large games: incentives and privacy. Proceedings of the 5th Conference on Innovations in Theoretical Computer Science.
Redrawing the boundaries on purchasing data from privacy-sensitive individuals. Proceedings of the 5th Conference on Innovations in Theoretical Computer Science.
This paper is about private data analysis, in which a trusted curator holding a confidential database responds to real vector-valued queries. A common approach to ensuring privacy for the database elements is to add appropriately generated random noise to the answers, releasing only these noisy responses. A line of study initiated in [7] examines the amount of distortion needed to prevent privacy violations of various kinds. The results in the literature vary according to several parameters, including the size of the database, the size of the universe from which data elements are drawn, the "amount" of privacy desired, and, for the purposes of the current work, the arity of the query. In this paper we sharpen and unify these bounds. Our foremost result combines the techniques of Hardt and Talwar [11] and McGregor et al. [13] to obtain linear lower bounds on distortion when providing differential privacy for a (contrived) class of low-sensitivity queries. (A query has low sensitivity if the data of a single individual has only a small effect on the answer.) Several structural results follow as immediate corollaries:
— We separate so-called counting queries from arbitrary low-sensitivity queries, proving that the latter require more noise, or distortion, than the former;
— We separate (ε,0)-differential privacy from its well-studied relaxation (ε,δ)-differential privacy, even when δ ∈ 2^{-o(n)} is negligible in the size n of the database, proving that the latter requires less distortion than the former;
— We demonstrate that (ε,δ)-differential privacy is much weaker than (ε,0)-differential privacy in terms of the mutual information between the transcript of the mechanism and the database, even when δ ∈ 2^{-o(n)} is negligible in the size n of the database.
We also simplify the lower bounds on noise for counting queries of [11] and make them unconditional.
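The noise-addition approach described above is typified by the Laplace mechanism of "Calibrating noise to sensitivity in private data analysis" (listed above): for a counting query, whose sensitivity is 1, adding Laplace noise of scale 1/ε yields (ε,0)-differential privacy. The following minimal Python sketch illustrates this; the database, the predicate, and all parameter values are invented for illustration and are not from the paper.

```python
import random

def laplace_sample(scale):
    # A Laplace(0, scale) variate is the difference of two
    # independent exponentials with mean `scale`.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(database, predicate, epsilon):
    """Answer a counting query with noise calibrated to its sensitivity.

    A counting query has sensitivity 1: changing one row changes the
    true count by at most 1, so Laplace noise of scale 1/epsilon
    suffices for (epsilon, 0)-differential privacy.
    """
    true_answer = sum(1 for row in database if predicate(row))
    return true_answer + laplace_sample(1.0 / epsilon)

# Toy database of ages; the query counts rows with age over 40.
db = [23, 45, 31, 67, 52, 38, 41, 29]
noisy = private_count(db, lambda age: age > 40, epsilon=0.5)
```

Smaller ε (stronger privacy) means larger noise scale and hence more distortion; the lower bounds discussed above show that for some query classes this trade-off cannot be avoided by any mechanism.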
Further, we use a characterization of (ε,δ)-differential privacy from [13] to obtain lower bounds on the distortion needed to ensure (ε,δ)-differential privacy for ε, δ > 0. We next revisit the LP decoding argument of [10] and combine it with a recent result of Rudelson [15] to improve on a result of Kasiviswanathan et al. [12] on noise lower bounds for privately releasing ℓ-way marginals.
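The LP decoding argument referenced above rests on the reconstruction phenomenon: if the noise added to subset-sum queries is too small, an adversary can recover almost all of the database. The sketch below illustrates this on a toy bit-vector database, replacing the actual linear program with a brute-force search over candidates (feasible only at this tiny scale); all names and parameters are illustrative, not the paper's construction.

```python
import itertools
import random

def reconstruct(n, queries, answers, bound):
    """Return any length-n bit vector consistent with all noisy answers.

    When the per-query noise bound is small relative to sqrt(n), every
    consistent candidate is close to the true database in Hamming
    distance -- so too little noise yields blatant non-privacy.
    """
    for cand in itertools.product((0, 1), repeat=n):
        if all(abs(sum(cand[i] for i in q) - a) <= bound
               for q, a in zip(queries, answers)):
            return cand
    return None

random.seed(1)
n = 8
secret = tuple(random.randint(0, 1) for _ in range(n))
# Random subset-sum queries, answered with noise of magnitude at most 1.
queries = [tuple(i for i in range(n) if random.random() < 0.5)
           for _ in range(40)]
bound = 1
answers = [sum(secret[i] for i in q) + random.choice((-1, 0, 1))
           for q in queries]
guess = reconstruct(n, queries, answers, bound)
```

The true database is always consistent with the answers, so the search is guaranteed to return some candidate; the point of the lower-bound arguments is that any such candidate must agree with the secret on most coordinates.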