Extractors Using Hardness Amplification
APPROX '09 / RANDOM '09 Proceedings of the 12th International Workshop and 13th International Workshop on Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques
We prove (mostly tight) space lower bounds for "streaming" (or "online") computations of four fundamental combinatorial objects: error-correcting codes, universal hash functions, extractors, and dispersers. Streaming computations of these objects are motivated algorithmically by massive data set applications and complexity-theoretically by pseudorandomness and derandomization for space-bounded probabilistic algorithms.

Our results reveal a surprising separation between extractors and dispersers in terms of the space required to compute them in the streaming model. While online extractors require space linear in their output length, we construct dispersers that are computable online with exponentially less space. We also present several explicit constructions of online extractors that match the lower bound.

We show that online universal and almost-universal hash functions require space linear in their output length (this bound was previously known only for "pure" universal hash functions [MNT93, BTY94]).

Finally, we show that both online encoding and online decoding of error-correcting codes require space proportional to the product of the length of the encoded message and the code's relative minimum distance. Block encoding trivially matches these lower bounds for constant-rate codes.
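To make the streaming model concrete, here is a minimal sketch (not taken from the paper) of an online computation of a standard inner-product universal hash over a prime field GF(p). The streaming algorithm reads the message one symbol at a time and keeps only a running sum modulo p, i.e. roughly log p bits of state, which is linear in the output length, consistent with the lower bound for universal hash functions stated above. The function name and the choice of hash family are illustrative assumptions.

```python
import random

def online_universal_hash(stream, keys, p):
    """Hash a streamed message with h_a(x) = sum_i a_i * x_i mod p.

    The only state kept between symbols is the running sum `acc`,
    about log2(p) bits -- linear in the output length.
    """
    acc = 0  # illustrative: the entire working memory of the streaming pass
    for x, a in zip(stream, keys):
        acc = (acc + a * x) % p
    return acc

# Usage: the online pass agrees with the batch (offline) computation.
p = 101
keys = [random.randrange(p) for _ in range(10)]  # random hash key a
msg = [random.randrange(p) for _ in range(10)]   # message symbols in GF(p)
batch = sum(a * x for a, x in zip(keys, msg)) % p
assert online_universal_hash(msg, keys, p) == batch
```

For distinct messages x != y, a uniformly random key a collides with probability exactly 1/p, so this family is universal; the point of the sketch is only that its natural streaming computation uses space matching the linear lower bound.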