Revealing information while preserving privacy. Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems.
k-anonymity: a model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems.
Smooth sensitivity and sampling in private data analysis. Proceedings of the thirty-ninth annual ACM Symposium on Theory of Computing.
The price of privacy and the limits of LP decoding. Proceedings of the thirty-ninth annual ACM Symposium on Theory of Computing.
Mechanism design via differential privacy. Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS '07).
A learning theory approach to non-interactive database privacy. Proceedings of the fortieth annual ACM Symposium on Theory of Computing (STOC '08).
Robust de-anonymization of large sparse datasets. Proceedings of the 2008 IEEE Symposium on Security and Privacy (SP '08).
Proceedings of the 49th Annual IEEE Symposium on Foundations of Computer Science (FOCS '08).
Universally utility-maximizing privacy mechanisms. Proceedings of the forty-first annual ACM Symposium on Theory of Computing.
Differential privacy: a survey of results. Proceedings of the 5th International Conference on Theory and Applications of Models of Computation (TAMC '08).
Proceedings of the 33rd International Conference on Automata, Languages and Programming (ICALP '06), Volume Part II.
Calibrating noise to sensitivity in private data analysis. Proceedings of the Third Conference on Theory of Cryptography (TCC '06).
Unconditional differentially private mechanisms for linear queries. Proceedings of the forty-fourth annual ACM Symposium on Theory of Computing (STOC '12).
Integrating historical noisy answers for improving data utility under differential privacy. Proceedings of the 15th International Conference on Extending Database Technology.
On optimal differentially private mechanisms for count-range queries. Proceedings of the 16th International Conference on Database Theory.
A differentially private mechanism of optimal utility for a region of priors. Proceedings of the Second International Conference on Principles of Security and Trust (POST '13).
Information preservation in statistical privacy and Bayesian estimation of unattributed histograms. Proceedings of the 2013 ACM SIGMOD International Conference on Management of Data.
A general framework for privacy preserving data publishing. Knowledge-Based Systems.
A scheme that publishes aggregate information about sensitive data must resolve the trade-off between utility to information consumers and privacy of the database participants. Differential privacy [5] is a well-established definition of privacy: it is a universal guarantee against all attackers, whatever their side-information or intent. Can we have a similar universal guarantee for utility? Decision theory offers two standard models of utility: Bayesian and minimax [13]. Ghosh et al. [8] show that a certain "geometric mechanism" gives optimal utility to all Bayesian information consumers. In this paper, we prove a similar result for minimax information consumers. Our result in fact holds for a wider class of information consumers that includes Bayesian consumers, and thus subsumes the result of [8]. We model information consumers as minimax (risk-averse) agents, each endowed with a loss function that models their tolerance to inaccuracies and each possessing some side-information about the query. Information consumers are also rational, in the sense that they actively combine the mechanism's output with their side-information so as to minimize their loss. Under this assumption of rational behavior, we show that, for every fixed count query, the geometric mechanism is universally optimal for all minimax information consumers. Additionally, our solution makes it possible to release query results to information consumers at different privacy levels in a collusion-resistant manner.
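For concreteness, the geometric mechanism of Ghosh et al. [8] answers a count query by adding two-sided geometric noise, with Pr[Z = z] proportional to alpha^|z| for alpha = e^(-epsilon). The sketch below (function names are illustrative, not from the paper) samples such noise as the difference of two shifted geometric variables:

```python
import numpy as np

def geometric_mechanism(true_count, epsilon, rng=None):
    """epsilon-differentially private count query answer
    via two-sided geometric noise (sketch)."""
    rng = rng if rng is not None else np.random.default_rng()
    alpha = np.exp(-epsilon)
    # G1, G2 are i.i.d. geometric on {0, 1, 2, ...} with
    # success probability 1 - alpha; their difference Z = G1 - G2
    # satisfies Pr[Z = z] = ((1 - alpha) / (1 + alpha)) * alpha**abs(z).
    g1 = rng.geometric(1 - alpha) - 1
    g2 = rng.geometric(1 - alpha) - 1
    return true_count + (g1 - g2)
```

The noise is integer-valued and symmetric around zero, which is what allows each consumer, whatever their prior and loss function, to post-process the released value into their own optimal estimate.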