A data distortion by probability distribution. ACM Transactions on Database Systems (TODS).
Security-control methods for statistical databases: a comparative study. ACM Computing Surveys (CSUR).
Secure statistical databases with random sample queries. ACM Transactions on Database Systems (TODS).
Revealing information while preserving privacy. Proceedings of the Twenty-Second ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (PODS).
The price of privacy and the limits of LP decoding. Proceedings of the Thirty-Ninth Annual ACM Symposium on Theory of Computing (STOC).
New efficient attacks on statistical disclosure control mechanisms. Advances in Cryptology: Proceedings of the 28th Annual Conference on Cryptology (CRYPTO 2008).
Differential privacy: a survey of results. Proceedings of the 5th International Conference on Theory and Applications of Models of Computation (TAMC'08).
On the geometry of differential privacy. Proceedings of the Forty-Second ACM Symposium on Theory of Computing (STOC).
A firm foundation for private data analysis. Communications of the ACM.
Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data.
Evaluating Laplace noise addition to satisfy differential privacy for numeric data. Transactions on Data Privacy.
Proceedings of the 33rd International Conference on Automata, Languages and Programming (ICALP'06), Volume Part II.
Calibrating noise to sensitivity in private data analysis. Proceedings of the Third Conference on Theory of Cryptography (TCC'06).
We prove that, against a database query-response privacy mechanism that perturbs its outputs by adding i.i.d. random noise, an adversary allowed a sufficiently large number of queries can exactly determine all records of an n-record database with overwhelming probability of success, and we establish corresponding quantitative confidence bounds on the attack's success probability. These confidence bounds do not depend on the cardinality |D| of the data domain D ⊂ R, where the database ${\mathcal{D}}$ is an element of the set $D^n$, and they even admit certain unbounded data domains D of countably infinite cardinality. In the context of differential privacy, our results also imply a lower bound on the variance of independent, Laplace-distributed noise that must be added to query responses if database privacy is to be preserved. Unlike Dinur & Nissim (2003) and Dwork & Yekhanin (2008), we do not require the additive noise to be bounded by $o(\sqrt{n})$; those works, on the other hand, do admit correlated noise.
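The attack the abstract describes can be illustrated with a minimal simulation. The sketch below is not the paper's construction; it assumes a hypothetical mechanism that answers each query about a single integer record with the true value plus i.i.d. Laplace noise, and shows how repeating the same query and averaging makes the noise concentrate away, so rounding recovers the record:

```python
import random

def noisy_query(true_value, scale, rng):
    # Output perturbation: the true answer plus i.i.d. Laplace(0, scale)
    # noise, sampled as the difference of two i.i.d. exponentials.
    noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
    return true_value + noise

def averaging_attack(query, num_queries):
    # Repeat the identical query; because the noise is independent and
    # zero-mean, the sample mean concentrates on the true answer.
    # Rounding then recovers an integer-valued record exactly.
    total = sum(query() for _ in range(num_queries))
    return round(total / num_queries)

rng = random.Random(0)
secret_record = 7   # hypothetical integer record
scale = 2.0         # Laplace scale b; per-query variance is 2*b^2

recovered = averaging_attack(
    lambda: noisy_query(secret_record, scale, rng), num_queries=10_000
)
print(recovered)
```

With b = 2 and 10,000 repetitions, the standard deviation of the sample mean is sqrt(2)·b/100 ≈ 0.028, so the rounded mean equals the secret record with near certainty. This is why the paper's lower bound on the noise variance (or, equivalently, an upper bound on the number of answered queries) is needed for privacy to survive.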