The minimum distance of turbo-like codes

  • Authors:
  • Louay Bazzi; Mohammad Mahdian; Daniel A. Spielman

  • Affiliations:
  • Department of Electrical and Computer Engineering, American University of Beirut, Beirut, Lebanon; Yahoo! Research, Santa Clara, CA; Department of Computer Science, Yale University, New Haven, CT

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2009

Abstract

Worst-case upper bounds are derived on the minimum distance of parallel concatenated turbo codes, serially concatenated convolutional codes, repeat-accumulate codes, repeat-convolute codes, and generalizations of these codes obtained by allowing nonlinear and large-memory constituent codes. It is shown that parallel concatenated turbo codes and repeat-convolute codes with sublinear memory are asymptotically bad. It is also shown that depth-two serially concatenated codes with constant-memory outer codes and sublinear-memory inner codes are asymptotically bad. Most of these upper bounds hold even when the convolutional encoders are replaced by general finite-state automata encoders. In contrast, it is proven that depth-three serially concatenated codes obtained by concatenating a repetition code with two accumulator codes through random permutations can be asymptotically good.
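
The code families named in the abstract are assembled from a few simple blocks: an outer repetition code, random interleavers (permutations), and rate-1 accumulators (running mod-2 sums). The following Python sketch is illustrative only, not taken from the paper; the function names and parameters (q, perm1, perm2) are assumptions. It shows how a repeat-accumulate (RA) encoder and the depth-three construction (a repetition code concatenated with two accumulators through random permutations) can be put together.

import random

def repeat(bits, q):
    """Outer repetition code: copy each information bit q times (rate 1/q)."""
    return [b for b in bits for _ in range(q)]

def permute(bits, perm):
    """Apply a fixed permutation (interleaver) to the bit sequence."""
    return [bits[i] for i in perm]

def accumulate(bits):
    """Rate-1 accumulator: output the running XOR (mod-2 prefix sums)."""
    out, s = [], 0
    for b in bits:
        s ^= b
        out.append(s)
    return out

def ra_encode(bits, q, perm):
    """Repeat-accumulate (RA) encoder: repetition -> interleaver -> accumulator."""
    return accumulate(permute(repeat(bits, q), perm))

def raa_encode(bits, q, perm1, perm2):
    """Depth-three serial concatenation from the abstract: repetition code
    followed by two accumulators, joined through random permutations."""
    inner1 = accumulate(permute(repeat(bits, q), perm1))
    return accumulate(permute(inner1, perm2))

if __name__ == "__main__":
    k, q = 8, 3
    n = k * q
    msg = [random.randint(0, 1) for _ in range(k)]
    perm1 = random.sample(range(n), n)   # random interleaver 1
    perm2 = random.sample(range(n), n)   # random interleaver 2
    print("RA codeword: ", ra_encode(msg, q, perm1))
    print("RAA codeword:", raa_encode(msg, q, perm1, perm2))

The RA encoder corresponds to the depth-two constructions whose minimum distance is bounded in the paper, while raa_encode mirrors the depth-three construction that the paper proves can be asymptotically good when the permutations are chosen at random.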