LP Decoding Corrects a Constant Fraction of Errors

  • Authors:
  • J. Feldman, T. Malkin, R. A. Servedio, C. Stein, M. J. Wainwright

  • Affiliations:
  • Dept. of Ind. Eng. & Oper. Res., Columbia Univ., New York, NY

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2007

Abstract

We show that for low-density parity-check (LDPC) codes whose Tanner graphs have sufficient expansion, the linear programming (LP) decoder of Feldman, Karger, and Wainwright can correct a constant fraction of errors. A random graph will have sufficient expansion with high probability, and recent work shows that such graphs can be constructed efficiently. A key element of our method is the use of a dual witness: a zero-valued dual solution to the decoding linear program whose existence proves decoding success. We show that as long as no more than a certain constant fraction of the bits are flipped by the channel, we can find a dual witness. This new method can be used for proving bounds on the performance of any LP decoder, even in a probabilistic setting. Our result implies that the word error rate of the LP decoder decreases exponentially in the code length under the binary-symmetric channel (BSC). This is the first such error bound for LDPC codes using an analysis based on "pseudocodewords." Recent work by Koetter and Vontobel shows that LP decoding and min-sum decoding of LDPC codes are closely related by the "graph cover" structure of their pseudocodewords; in their terminology, our result implies that there exist families of LDPC codes whose minimum BSC pseudoweight grows linearly in the block length.
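
To make the decoding setup concrete, below is a minimal sketch of the Feldman-Karger-Wainwright LP relaxation on the BSC, solved with SciPy's `linprog`. The (7,4) Hamming parity-check matrix is only an illustrative stand-in (not an expander LDPC code from the paper), and the function name `lp_decode_bsc` is hypothetical; the constraints are the standard "forbidden set" inequalities that define the relaxed codeword polytope.

```python
# A minimal sketch of LP decoding over the BSC (assumption: SciPy's HiGHS LP solver).
import itertools
import numpy as np
from scipy.optimize import linprog

# Parity-check matrix of the (7,4) Hamming code -- an illustrative stand-in only.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])

def lp_decode_bsc(H, y):
    """LP-decode the received 0/1 word y over the binary-symmetric channel."""
    m, n = H.shape
    # BSC cost vector: gamma_i = +1 if y_i = 0, -1 if y_i = 1.
    gamma = 1.0 - 2.0 * np.asarray(y, dtype=float)
    A_ub, b_ub = [], []
    # One inequality per check j and per odd-sized subset S of its neighborhood N(j):
    #   sum_{i in S} x_i - sum_{i in N(j)\S} x_i <= |S| - 1
    for j in range(m):
        nbrs = np.flatnonzero(H[j])
        for r in range(1, len(nbrs) + 1, 2):          # odd subset sizes
            for S in itertools.combinations(nbrs, r):
                row = np.zeros(n)
                row[list(nbrs)] = -1.0
                row[list(S)] = 1.0
                A_ub.append(row)
                b_ub.append(len(S) - 1)
    res = linprog(gamma, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, 1)] * n, method="highs")
    # If the optimum is integral, it is the ML codeword (the "ML certificate" property).
    return res.x

# Flip one bit of the all-zeros codeword; the LP recovers the transmitted word.
received = np.zeros(7, dtype=int)
received[2] = 1
print(np.round(lp_decode_bsc(H, received), 3))
```

A fractional optimum of this LP corresponds to a pseudocodeword; the paper's dual-witness argument shows that, on sufficiently expanding Tanner graphs, no fractional point beats the transmitted codeword as long as at most a constant fraction of bits are flipped.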