How to reduce your enemy's information
Advances in Cryptology: CRYPTO '85, Lecture Notes in Computer Science 218
Unbiased bits from sources of weak randomness and probabilistic communication complexity
SIAM Journal on Computing - Special issue on cryptography
Pseudo-random generation from one-way functions
STOC '89 Proceedings of the twenty-first annual ACM symposium on Theory of computing
Journal of Computer and System Sciences
The art of computer programming, volume 3: (2nd ed.) sorting and searching
Extracting randomness: a survey and new constructions
Journal of Computer and System Sciences
Bounds for Dispersers, Extractors, and Depth-Two Superconcentrators
SIAM Journal on Discrete Mathematics
Data streams: algorithms and applications
SODA '03 Proceedings of the fourteenth annual ACM-SIAM symposium on Discrete algorithms
The unified theory of pseudorandomness: guest column
ACM SIGACT News
Why simple hash functions work: exploiting the entropy in a data stream
SODA '08 Proceedings of the nineteenth annual ACM-SIAM symposium on Discrete algorithms
Tight Bounds for Hashing Block Sources
APPROX '08 / RANDOM '08 Proceedings of the 11th international workshop, APPROX 2008, and 12th international workshop, RANDOM 2008 on Approximation, Randomization and Combinatorial Optimization: Algorithms and Techniques
Distinguishing distributions using Chernoff information
ProvSec'10 Proceedings of the 4th international conference on Provable security
It is known that if a 2-universal hash function H is applied to elements of a block source (X_1, ..., X_T), where each item X_i has enough min-entropy conditioned on the previous items, then the output distribution (H, H(X_1), ..., H(X_T)) will be "close" to the uniform distribution. We provide improved bounds on how much min-entropy per item is required for this to hold, both when we ask that the output be close to uniform in statistical distance and when we only ask that it be statistically close to a distribution with small collision probability. In both cases, we reduce the dependence of the min-entropy on the number T of items from 2 log T in previous work to log T, which we show to be optimal. This leads to corresponding improvements to the recent results of Mitzenmacher and Vadhan (SODA '08) on the analysis of hashing-based algorithms and data structures when the data items come from a block source.
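The setting described in the abstract is a single hash function drawn from a 2-universal family and applied to every item of a data stream. The sketch below illustrates that setting with the standard Carter-Wegman family h_{a,b}(x) = ((a*x + b) mod p) mod m; the prime p, the table size M, and the synthetic stream are illustrative choices of ours, not parameters or code from the paper.

```python
import random

# Minimal sketch of the abstract's setting: pick one hash function h from a
# 2-universal family and apply it to all items of a stream.
# The family is the standard Carter-Wegman construction
#   h_{a,b}(x) = ((a*x + b) mod p) mod m,  a in {1,...,p-1}, b in {0,...,p-1},
# which satisfies Pr[h(x) = h(y)] <= 1/m for any fixed keys x != y below p.
# P, M, and the random stream below are illustrative, not from the paper.

P = (1 << 61) - 1   # a Mersenne prime larger than any key hashed here
M = 1 << 16         # number of hash buckets (table size m)

def sample_hash():
    """Draw h uniformly from the 2-universal family {h_{a,b}}."""
    a = random.randrange(1, P)
    b = random.randrange(0, P)
    return lambda x: ((a * x + b) % P) % M

if __name__ == "__main__":
    h = sample_hash()                                   # one h shared by the whole stream
    stream = [random.getrandbits(48) for _ in range(1 << 12)]  # stand-in block source
    hashed = [h(x) for x in stream]                     # the sequence (h(X_1), ..., h(X_T))
    # The theorem concerns this joint output: if each X_i has enough min-entropy
    # given the earlier items, (h, h(X_1), ..., h(X_T)) is statistically close to
    # uniform over [M]^T; the paper's contribution is how little min-entropy suffices.
    print(f"distinct buckets used: {len(set(hashed))} of {M}")
```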