Decoding of Reed-Solomon codes beyond the error-correction bound
Journal of Complexity
Linear time erasure codes with nearly optimal recovery
FOCS '95 Proceedings of the 36th Annual Symposium on Foundations of Computer Science
Expander-Based Constructions of Efficiently Decodable Codes
FOCS '01 Proceedings of the 42nd IEEE Symposium on Foundations of Computer Science
Computationally efficient error-correcting codes and holographic proofs
List decoding of error-correcting codes
IEEE Transactions on Information Theory
Linear-time encodable and decodable error-correcting codes
IEEE Transactions on Information Theory
List decoding of algebraic-geometric codes
IEEE Transactions on Information Theory
Improved decoding of Reed-Solomon and algebraic-geometry codes
IEEE Transactions on Information Theory
Efficient decoding of Reed-Solomon codes beyond half the minimum distance
IEEE Transactions on Information Theory
Error exponents of expander codes
IEEE Transactions on Information Theory
Linear time encodable and list decodable codes
STOC '03 Proceedings of the 35th Annual ACM Symposium on Theory of Computing
Reconstructing curves in three (and higher) dimensional space from noisy data
STOC '03 Proceedings of the 35th Annual ACM Symposium on Theory of Computing
Better extractors for better codes?
STOC '04 Proceedings of the 36th Annual ACM Symposium on Theory of Computing
Guest column: error-correcting codes and expander graphs
ACM SIGACT News
SODA '05 Proceedings of the sixteenth annual ACM-SIAM symposium on Discrete algorithms
Correcting Errors Beyond the Guruswami-Sudan Radius in Polynomial Time
FOCS '05 Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science
Algorithmic results in list decoding
Foundations and Trends® in Theoretical Computer Science
Linear time decoding of regular expander codes
Proceedings of the 3rd Innovations in Theoretical Computer Science Conference
Improving the alphabet-size in high noise, almost optimal rate list decodable codes
STACS '05 Proceedings of the 22nd Annual Symposium on Theoretical Aspects of Computer Science
Hardness amplification via space-efficient direct products
LATIN'06 Proceedings of the 7th Latin American conference on Theoretical Informatics
Linear-time decoding of regular expander codes
ACM Transactions on Computation Theory (TOCT) - Special issue on innovations in theoretical computer science 2012
We present an explicit construction of linear-time encodable and decodable codes of rate r which can correct a fraction (1 - r - ε)/2 of errors over an alphabet of constant size depending only on ε, for every 0 < r < 1 and arbitrarily small ε > 0. The error-correction performance of these codes is optimal, as seen by the Singleton bound (these are "near-MDS" codes). Such near-MDS linear-time codes were known for decoding from erasures [2]; our construction generalizes this to handle errors as well. Concatenating these codes with good, constant-sized binary codes gives a construction of linear-time binary codes which meet the so-called "Zyablov bound". In a nutshell, our results match the performance of the previously known explicit constructions of codes that had polynomial-time encoding and decoding, but in addition have linear-time encoding and decoding algorithms.

We also obtain some results for list decoding targeted at the situation when the fraction of errors is very large, namely (1 - ε) for an arbitrarily small constant ε > 0. The previously known constructions of such codes of good rate over constant-sized alphabets either used algebraic-geometric codes, and thus suffered from complicated constructions and slow decoding, or, as in the recent work of the authors [9], had fast encoding/decoding but suffered from an alphabet size that was exponential in 1/ε. We present two constructions of such codes with rate close to Ω(ε²) over an alphabet of size quasi-polynomial in 1/ε. One of the constructions, at the expense of a slight worsening of the rate, can achieve an alphabet size which is polynomial in 1/ε. It also yields constructions of codes for list decoding from erasures which achieve new trade-offs. In particular, we construct codes of rate close to the optimal Ω(ε) which can be efficiently list decoded from a fraction (1 - ε) of erasures.
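As an illustrative sketch (not from the paper itself; the function names are ours), the trade-off stated in the first paragraph can be tabulated directly: a rate-r code can uniquely correct at most a (1 - r)/2 fraction of errors by the Singleton bound, and the near-MDS construction above approaches this within a loss of ε/2.

```python
def singleton_error_fraction(r):
    """Singleton-bound limit on the correctable error fraction
    for unique decoding of a rate-r code."""
    return (1.0 - r) / 2.0

def near_mds_error_fraction(r, eps):
    """Error fraction (1 - r - eps)/2 achieved by the near-MDS
    linear-time codes described in the abstract."""
    return (1.0 - r - eps) / 2.0

if __name__ == "__main__":
    eps = 0.01
    for r in (0.25, 0.50, 0.75):
        opt = singleton_error_fraction(r)
        ach = near_mds_error_fraction(r, eps)
        print(f"rate {r:.2f}: Singleton limit {opt:.3f}, achieved {ach:.3f}")
```

The gap between the two quantities is exactly ε/2, independent of the rate, which is why the codes are called "near-MDS": for any fixed ε the alphabet size stays constant while the loss against the Singleton bound stays ε/2.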