Bounds for Dispersers, Extractors, and Depth-Two Superconcentrators. SIAM Journal on Discrete Mathematics.
Extractors and pseudorandom generators. Journal of the ACM (JACM).
Code bounds for multiple packings over a nonbinary finite alphabet. Problems of Information Transmission.
On the convexity of one coding-theory function. Problems of Information Transmission.
Limits to List Decoding Random Codes. COCOON '09: Proceedings of the 15th Annual International Conference on Computing and Combinatorics.
A lower bound on list size for list decoding. APPROX'05/RANDOM'05: Proceedings of the 8th International Workshop on Approximation, Randomization and Combinatorial Optimization Problems, and the 9th International Conference on Randomization and Computation: Algorithms and Techniques.
Combinatorial bounds for list decoding. IEEE Transactions on Information Theory.
List decoding from erasures: bounds and code constructions. IEEE Transactions on Information Theory.
On the list decodability of random linear codes with large error rates. Proceedings of the Forty-Fifth Annual ACM Symposium on Theory of Computing.
A q-ary error-correcting code C ⊆ {1,2,...,q}^n is said to be list decodable to radius ρ with list size L if every Hamming ball of radius ρ contains at most L codewords of C. We prove that in order for a q-ary code to be list-decodable up to radius (1 - 1/q)(1 - ε)n, we must have L = Ω(1/ε^2). Specifically, we prove that there exists a constant c_q > 0 and a function f_q such that for small enough ε > 0, if C is list-decodable to radius (1 - 1/q)(1 - ε)n with list size c_q/ε^2, then C has at most f_q(ε) codewords, independent of n. This result is asymptotically tight (treating q as a constant), since such codes with an exponential (in n) number of codewords are known for list size L = O(1/ε^2). A result similar to ours is implicit in Blinovsky (Problems of Information Transmission, 1986) for the binary (q = 2) case. Our proof is simpler, works for all alphabet sizes, and provides more intuition for why the lower bound arises.
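The definition above can be checked directly for small codes: a code is list-decodable to radius ρ with list size L exactly when no Hamming ball of radius ρ (centered at any word of {1,...,q}^n, not only at codewords) contains more than L codewords. A minimal brute-force sketch of this check, with a toy code and parameters chosen purely for illustration:

```python
from itertools import product

def hamming_dist(x, y):
    # Number of coordinates in which the two words differ.
    return sum(a != b for a, b in zip(x, y))

def max_list_size(code, radius, q, n):
    """Largest number of codewords of `code` contained in any Hamming
    ball of the given radius; centers range over all of {1,...,q}^n."""
    worst = 0
    for center in product(range(1, q + 1), repeat=n):
        hits = sum(1 for c in code if hamming_dist(center, c) <= radius)
        worst = max(worst, hits)
    return worst

# Toy binary code (alphabet {1,2}, length n = 4).
code = [(1, 1, 1, 1), (2, 2, 2, 2), (1, 1, 2, 2)]
# The ball of radius 1 around (1,1,1,2) contains two codewords, and no
# ball of radius 1 contains all three, so the list size at radius 1 is 2.
print(max_list_size(code, radius=1, q=2, n=4))  # → 2
```

This exhaustive check costs q^n ball centers and is only feasible for tiny parameters; it is meant to make the definition concrete, not to be an efficient list-decoding procedure.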