Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two adjacent datasets give the same answer is bounded by e^ε. In the field of information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of min-entropy leakage, a concept strictly related to the Bayes risk. In this paper, we show how to model the query system in terms of an information-theoretic channel, and we compare the notion of differential privacy with that of min-entropy leakage. We show that differential privacy implies a bound on the min-entropy leakage, but not vice versa. Furthermore, we show that our bound is tight. We then consider the utility of the randomization mechanism, which represents how close the randomized answers are, on average, to the real ones. We show that differential privacy also implies a bound on utility, likewise tight, and we propose a method that, under certain conditions, builds an optimal randomization mechanism, i.e. a mechanism that provides the best utility while guaranteeing ε-differential privacy.
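The two notions compared above can be sketched concretely. The following is a minimal illustrative example (not code from the paper): a randomized query mechanism is modeled as a channel matrix C, where C[x][y] is the probability of reporting answer y on dataset x; min-entropy leakage is computed from prior and posterior vulnerability, and ε-differential privacy is checked as a ratio bound between the rows of adjacent datasets. The function names, the example matrix, and the adjacency relation are all hypothetical choices for illustration.

```python
import math

def min_entropy_leakage(channel, prior):
    """Min-entropy leakage L = log2(V_post / V_prior), where
    V_prior = max_x p(x) and V_post = sum_y max_x p(x) * C[x][y]."""
    v_prior = max(prior)
    v_post = sum(
        max(prior[x] * channel[x][y] for x in range(len(channel)))
        for y in range(len(channel[0]))
    )
    return math.log2(v_post / v_prior)

def satisfies_dp(channel, adjacent_pairs, eps):
    """Check the epsilon-DP condition C[x][y] <= e^eps * C[x'][y]
    (in both directions) for every pair of adjacent datasets."""
    bound = math.exp(eps)
    return all(
        channel[x][y] <= bound * channel[x2][y]
        and channel[x2][y] <= bound * channel[x][y]
        for (x, x2) in adjacent_pairs
        for y in range(len(channel[0]))
    )

# Two adjacent datasets and a binary, randomized-response-style mechanism.
C = [[0.75, 0.25],
     [0.25, 0.75]]
uniform = [0.5, 0.5]

print(min_entropy_leakage(C, uniform))         # leakage in bits (log2 1.5)
print(satisfies_dp(C, [(0, 1)], math.log(3)))  # ratio 0.75/0.25 = 3 = e^ln 3
```

Here the mechanism satisfies ε-differential privacy for ε = ln 3 and leaks log2(1.5) ≈ 0.585 bits under a uniform prior, illustrating on a toy scale how the ε bound constrains the min-entropy leakage.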