Shannon entropy is a useful and important measure in information processing, for instance in data compression or randomness extraction, under the assumption—which can typically safely be made in communication theory—that a certain random experiment is independently repeated many times. In cryptography, however, where a system must be proven secure against a malicious adversary, this assumption usually translates into a restriction on the adversary's knowledge or behavior and is generally not satisfied. An example is quantum key agreement, where the adversary can attack each particle sent through the quantum channel differently, or even carry out coherent attacks combining a number of particles. In information-theoretic key agreement, the central functionalities of information reconciliation and privacy amplification have therefore been extensively studied in the scenario of general distributions: partial solutions have been given, but the obtained bounds can be arbitrarily far from tight, and a full analysis appeared rather involved. We show that the general case is actually no more difficult than the scenario of independent repetitions—in fact, from our new point of view, it is even simpler. When one analyzes the achievable efficiency of data compression and randomness extraction in the case of independent repetitions, the Shannon entropy H is the answer. We show that, in these two contexts, H can be generalized to two very simple quantities—$H_0^\epsilon$ and $H_\infty^\epsilon$, called smooth Rényi entropies—which give tight bounds for data compression (hence, information reconciliation) and randomness extraction (privacy amplification), respectively. The two new quantities, and related notions, not only extend Shannon entropy in the described contexts but also share central properties of the latter, such as the chain rule, sub-additivity, and monotonicity.
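For classical distributions, the two smooth Rényi entropies admit simple closed-form characterizations: $H_0^\epsilon$ is the log of the size of the smallest event carrying probability at least $1-\epsilon$, and $H_\infty^\epsilon$ is $-\log \lambda^*$, where $\lambda^*$ is the smallest cap such that shaving all probabilities down to $\lambda^*$ removes at most $\epsilon$ total mass. The following Python sketch (illustrative only, not code from the paper) computes both quantities for a finite distribution:

```python
import math

def smooth_max_entropy(p, eps):
    """H_0^eps: log2 of the smallest support set capturing mass >= 1 - eps."""
    probs = sorted(p, reverse=True)
    mass, k = 0.0, 0
    for q in probs:
        if mass >= 1 - eps:
            break
        mass += q
        k += 1
    return math.log2(k)

def smooth_min_entropy(p, eps):
    """H_inf^eps: -log2 of the smallest cap lam with
    sum_x max(p(x) - lam, 0) <= eps (optimal smoothing caps the peaks)."""
    lo, hi = 0.0, max(p)  # the answer lies in [0, max(p)]
    for _ in range(100):  # binary search for the smallest feasible cap
        mid = (lo + hi) / 2
        excess = sum(max(q - mid, 0.0) for q in p)
        if excess <= eps:
            hi = mid  # cap is feasible; try smaller
        else:
            lo = mid  # too much mass removed; cap must grow
    return -math.log2(hi)

# For the uniform distribution, both reduce to the ordinary entropy log2(n):
uniform = [0.25] * 4
print(smooth_max_entropy(uniform, 0.0), smooth_min_entropy(uniform, 0.0))

# Smoothing helps on peaked distributions: one heavy atom of 0.5 gives
# H_inf = 1 bit, but allowing eps = 0.25 caps it at 0.25, yielding 2 bits.
peaked = [0.5] + [0.1] * 5
print(smooth_min_entropy(peaked, 0.0), smooth_min_entropy(peaked, 0.25))
```

The peaked example shows why smoothing matters for privacy amplification: a single high-probability outcome need not limit the extractable key length if an $\epsilon$-close distribution without that peak exists.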