Limits on the provable consequences of one-way permutations. STOC '89: Proceedings of the Twenty-First Annual ACM Symposium on Theory of Computing.
Practical privacy: the SuLQ framework. Proceedings of the Twenty-Fourth ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems.
Bounds on the Efficiency of Generic Cryptographic Constructions. SIAM Journal on Computing.
Smooth sensitivity and sampling in private data analysis. Proceedings of the Thirty-Ninth Annual ACM Symposium on Theory of Computing.
A learning theory approach to non-interactive database privacy. STOC '08: Proceedings of the Fortieth Annual ACM Symposium on Theory of Computing.
FOCS '08: Proceedings of the 49th Annual IEEE Symposium on Foundations of Computer Science.
Computational Differential Privacy. CRYPTO '09: Proceedings of the 29th Annual International Cryptology Conference on Advances in Cryptology.
The Limits of Two-Party Differential Privacy. FOCS '10: Proceedings of the IEEE 51st Annual Symposium on Foundations of Computer Science.
ICALP '06: Proceedings of the 33rd International Conference on Automata, Languages and Programming, Part II.
Our data, ourselves: privacy via distributed noise generation. EUROCRYPT '06: Proceedings of the 24th Annual International Conference on the Theory and Applications of Cryptographic Techniques.
Calibrating noise to sensitivity in private data analysis. TCC '06: Proceedings of the Third Conference on Theory of Cryptography.
Differential privacy is a well-established definition guaranteeing that queries to a database do not reveal "too much" information about specific individuals who have contributed to the database. The standard definition of differential privacy is information-theoretic in nature, but it is natural to consider computational relaxations and to explore what can be achieved with respect to such notions. Mironov et al. (Crypto 2009) and McGregor et al. (FOCS 2010) recently introduced and studied several variants of computational differential privacy, and showed that in the two-party setting (where data is split between two parties) these relaxations can offer significant advantages. Left open by prior work was the extent, if any, to which computational differential privacy can help in the usual client/server setting, where the entire database resides at the server and the client poses queries on this data. We show, for queries with output in R^n (for constant n) and with respect to a large class of utilities, that any computationally private mechanism can be converted to a statistically private mechanism that is equally efficient and achieves roughly the same utility.
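The information-theoretic notion of differential privacy referenced above is commonly illustrated by the Laplace mechanism of Dwork et al. (Calibrating noise to sensitivity in private data analysis, TCC '06). The following minimal sketch is not taken from the paper; it is an illustrative example showing how a counting query (sensitivity 1) is answered in an epsilon-differentially private way by adding Laplace noise with scale 1/epsilon. The function names `laplace_sample` and `private_count` are hypothetical, chosen for this example.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(database, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one row
    changes the true count by at most 1, so Laplace noise with
    scale 1/epsilon suffices for the standard DP guarantee.
    """
    true_count = sum(1 for row in database if predicate(row))
    return true_count + laplace_sample(1.0 / epsilon)

# Example: privately count even entries in a toy database.
db = list(range(100))
noisy = private_count(db, lambda x: x % 2 == 0, epsilon=0.5)
```

Smaller values of epsilon give stronger privacy at the cost of noisier answers; the paper's result concerns whether relaxing this statistical guarantee to a computational one can buy better utility in the client/server setting.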