On the relation between differential privacy and quantitative information flow
ICALP'11 Proceedings of the 38th international conference on Automata, languages and programming - Volume Part II
Quantitative information flow and applications to differential privacy
Foundations of security analysis and design VI
Probabilistic relational reasoning for differential privacy
POPL '12 Proceedings of the 39th annual ACM SIGPLAN-SIGACT symposium on Principles of programming languages
Differential privacy: on the trade-off between utility and information leakage
FAST'11 Proceedings of the 8th international conference on Formal Aspects of Security and Trust
Min-Entropy leakage of channels in cascade
FAST'11 Proceedings of the 8th international conference on Formal Aspects of Security and Trust
Worst- and average-case privacy breaches in randomization mechanisms
TCS'12 Proceedings of the 7th IFIP TC 1/WG 2.2 international conference on Theoretical Computer Science
Linear dependent types for differential privacy
POPL '13 Proceedings of the 40th annual ACM SIGPLAN-SIGACT symposium on Principles of programming languages
A differentially private mechanism of optimal utility for a region of priors
POST'13 Proceedings of the Second international conference on Principles of Security and Trust
Information-Theoretic foundations of differential privacy
FPS'12 Proceedings of the 5th international conference on Foundations and Practice of Security
πBox: a platform for privacy-preserving apps
NSDI'13 Proceedings of the 10th USENIX conference on Networked Systems Design and Implementation
Probabilistic Relational Reasoning for Differential Privacy
ACM Transactions on Programming Languages and Systems (TOPLAS)
There are two active and independent lines of research that aim at quantifying the amount of information disclosed by computing on confidential data, and each has developed its own notion of confidentiality: on the one hand, differential privacy is the emerging consensus guarantee for privacy-preserving data analysis; on the other hand, information-theoretic notions of leakage are used to characterize the confidentiality properties of programs in language-based settings. The purpose of this article is to establish formal connections between both notions of confidentiality and to compare them in terms of the security guarantees they deliver. We obtain the following results. First, we establish upper bounds for the leakage of every ε-differentially private mechanism in terms of ε and the size of the mechanism's input domain; we achieve this by identifying and leveraging a connection to coding theory. Second, we construct a class of ε-differentially private channels whose leakage grows with the size of their input domains; using these channels, we show that there cannot be domain-size-independent bounds for the leakage of all ε-differentially private mechanisms. Moreover, an empirical evaluation shows that the leakage of these channels almost matches our theoretical upper bounds, demonstrating the accuracy of the bounds. Finally, we show that the question of providing optimal upper bounds for the leakage of ε-differentially private mechanisms in terms of rational functions of ε is in fact decidable.
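The interplay the abstract describes can be made concrete with a small sketch. The following Python snippet (an illustration, not the construction from the article; the function names and the choice of k-ary randomized response as the example channel are my own) builds an ε-differentially private channel matrix, checks the ε-DP constraint on its rows, and computes its min-entropy leakage under a uniform prior, which is the log of the sum of the column maxima of the channel matrix. Running it for a fixed ε and growing input-domain size k shows leakage increasing with k, in line with the claim that no domain-size-independent bound exists.

```python
import math

def randomized_response(k, eps):
    """Channel matrix of k-ary randomized response: report the true value
    with the largest probability compatible with eps-DP, otherwise a
    uniformly chosen other value. Rows are inputs, columns are outputs."""
    e = math.exp(eps)
    p_true = e / (k - 1 + e)
    p_other = 1 / (k - 1 + e)
    return [[p_true if x == y else p_other for y in range(k)]
            for x in range(k)]

def is_eps_dp(C, eps, tol=1e-9):
    """Check the eps-DP constraint: P(y|x) <= e^eps * P(y|x')
    for every output y and every pair of inputs x, x'."""
    e = math.exp(eps)
    return all(C[x][y] <= e * C[xp][y] + tol
               for y in range(len(C[0]))
               for x in range(len(C))
               for xp in range(len(C)))

def min_entropy_leakage(C):
    """Min-entropy leakage (in bits) under a uniform prior:
    log2 of the sum of the column maxima of the channel matrix."""
    return math.log2(sum(max(col) for col in zip(*C)))
```

For example, with ε = ln 3 and k = 2 the channel reports the truth with probability 3/4 and its leakage is log2(1.5) ≈ 0.585 bits; keeping ε fixed and increasing k makes the leakage grow, while it always stays below ε · log2(e) for this particular channel.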