Theories of quantitative information flow offer an attractive framework for analyzing confidentiality in practical systems, which often cannot avoid "small" leaks of confidential information. Recently there has been growing interest in the theory of min-entropy leakage, which measures uncertainty in terms of a random variable's vulnerability to being guessed in one try by an adversary. Here we contribute to this theory by studying the min-entropy leakage of systems formed by cascading two channels, using the output of the first channel as the input to the second. After considering the semantics of cascading carefully and exposing some technical subtleties, we prove that the min-entropy leakage of a cascade of two channels cannot exceed the leakage of the first channel; this result is a min-entropy analogue of the classic data-processing inequality. We show, however, that no comparable bound holds for the second channel. We then consider the min-capacity, i.e. the maximum leakage over all a priori distributions, and show that the min-capacity of a cascade of two channels cannot exceed the min-capacity of either channel.
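The quantities discussed in the abstract are easy to compute for finite channels. The sketch below, using standard definitions of min-entropy leakage and min-capacity (the channel matrices and prior are hypothetical examples, not taken from the paper), illustrates both results: cascading is matrix multiplication of channel matrices, the leakage of the cascade is bounded by that of the first channel, and its min-capacity is bounded by the min-capacity of each component.

```python
import numpy as np

def min_entropy_leakage(prior, channel):
    """Min-entropy leakage L(pi, C) = log2(V_posterior / V_prior), where
    vulnerability V is the adversary's probability of guessing the secret
    in one try (before vs. after observing the channel output)."""
    v_prior = prior.max()
    # Posterior vulnerability: for each output, the best guess is the input
    # maximizing the joint probability; sum these column maxima over outputs.
    v_post = (prior[:, None] * channel).max(axis=0).sum()
    return np.log2(v_post / v_prior)

def min_capacity(channel):
    """Min-capacity ML(C) = log2 of the sum of the column maxima of C;
    it equals the maximum min-entropy leakage over all priors."""
    return np.log2(channel.max(axis=0).sum())

# Hypothetical channel matrices: rows are inputs, columns are outputs,
# each row summing to 1.
C1 = np.array([[0.8, 0.2],
               [0.3, 0.7]])
C2 = np.array([[0.6, 0.4],
               [0.1, 0.9]])
cascade = C1 @ C2              # cascading = product of the channel matrices
pi = np.array([0.5, 0.5])      # uniform a priori distribution

# Data-processing-style bound: the cascade leaks no more than the first channel.
assert min_entropy_leakage(pi, cascade) <= min_entropy_leakage(pi, C1) + 1e-12
# Min-capacity bound: the cascade's min-capacity is at most either channel's.
assert min_capacity(cascade) <= min(min_capacity(C1), min_capacity(C2)) + 1e-12
```

On this example the cascade leaks about 0.32 bits under the uniform prior, versus about 0.58 bits for the first channel alone, consistent with the stated bound; no analogous comparison with the second channel is asserted, since the paper shows that bound fails in general.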