Bit-interleaved coded modulation revisited: a mismatched decoding perspective

  • Authors:
  • Alfonso Martinez; Albert Guillén i Fàbregas; Giuseppe Caire; Frans M. J. Willems

  • Affiliations:
  • Centrum voor Wiskunde en Informatica, Amsterdam, The Netherlands; Department of Engineering, University of Cambridge, Cambridge, UK; Electrical Engineering Department, University of Southern California, Los Angeles, CA, USA; Department of Electrical Engineering, Technische Universiteit Eindhoven, Eindhoven, The Netherlands

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2009

Abstract

We revisit the information-theoretic analysis of bit-interleaved coded modulation (BICM) by modeling the BICM decoder as a mismatched decoder. The mismatched decoding model is well defined for finite, yet arbitrary, block lengths, and naturally captures the channel memory among the bits belonging to the same symbol. We give two independent proofs of the achievability of the BICM capacity calculated by Caire et al., where BICM was modeled as a set of independent parallel binary-input channels whose output is the bitwise log-likelihood ratio. Our first achievability proof uses typical sequences and shows that, due to the random coding construction, the interleaver is not required. The second proof is based on random coding error exponents with mismatched decoding, where the largest achievable rate is the generalized mutual information. Moreover, the generalized mutual information of the mismatched decoder coincides with the infinite-interleaver BICM capacity. We show that the error exponent, and hence the cutoff rate, of the BICM mismatched decoder is upper-bounded by that of coded modulation and may thus be lower than that of the infinite-interleaver model; for binary reflected Gray mapping in Gaussian channels the loss in error exponent is small. We also consider the mutual information appearing in the analysis of iterative decoding of BICM with extrinsic information transfer (EXIT) charts: if the symbol metric has knowledge of the transmitted symbol, the EXIT mutual information admits a representation as a pseudo-generalized mutual information, which is in general not achievable. A different symbol decoding metric, for which the extrinsic side information refers to the hypothesized symbol, induces a generalized mutual information lower than the coded modulation capacity. In this case, perfect extrinsic side information turns the mismatched-decoder error exponent into that of coded modulation.
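
For readers who want the quantities behind these statements, the following is a brief sketch in standard notation; the bit labeling b_i(x), the subconstellations X_b^i, and the parameter s follow common usage in the mismatched-decoding literature rather than being quoted verbatim from the paper. The classical BICM decoder scores each candidate symbol with a product of bit metrics, and the generalized mutual information of that mismatched decoder recovers the BICM capacity of Caire et al.

```latex
% Symbol decoding metric of the classical BICM decoder: a product of bit
% metrics, where \mathcal{X}_b^i denotes the symbols whose i-th label bit is b.
q(x,y) \;=\; \prod_{i=1}^{m} q_i\bigl(b_i(x),\,y\bigr),
\qquad
q_i(b,y) \;=\; \sum_{x' \in \mathcal{X}_b^i} P_{Y\mid X}(y \mid x').

% Generalized mutual information of this mismatched decoder with
% equiprobable inputs on the signal set \mathcal{X}:
I_{\mathrm{gmi}}(s) \;=\; \mathbb{E}\!\left[
  \log_2 \frac{q(X,Y)^{s}}
              {\frac{1}{|\mathcal{X}|}\sum_{x' \in \mathcal{X}} q(x',Y)^{s}}
\right],
\qquad
C_{\mathrm{gmi}} \;=\; \sup_{s>0} I_{\mathrm{gmi}}(s).

% The coincidence stated in the abstract: C_{\mathrm{gmi}} equals the
% infinite-interleaver BICM capacity
C_{\mathrm{bicm}} \;=\; \sum_{i=1}^{m} I(B_i;\,Y).
```

As a complementary numerical illustration, the short Python sketch below estimates C_bicm by Monte Carlo for Gray-mapped 16-QAM on a complex AWGN channel; the constellation, labeling, sample size, and SNR are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

# Monte Carlo estimate of the (infinite-interleaver) BICM capacity
# C_bicm = sum_i I(B_i; Y) for Gray-mapped 16-QAM over complex AWGN.
# All parameters below are illustrative, not taken from the paper.
rng = np.random.default_rng(0)

# Binary reflected Gray labeling: 2 Gray-coded bits per I/Q dimension.
pam = np.array([-3.0, -1.0, 1.0, 3.0])
gray2 = [(0, 0), (0, 1), (1, 1), (1, 0)]           # Gray order of the 4 PAM levels
points, labels = [], []
for ii, (b0, b1) in enumerate(gray2):
    for qq, (b2, b3) in enumerate(gray2):
        points.append(pam[ii] + 1j * pam[qq])
        labels.append((b0, b1, b2, b3))
points = np.array(points)
points /= np.sqrt(np.mean(np.abs(points) ** 2))    # unit average symbol energy
labels = np.array(labels)                          # shape (16, 4): bit label of each symbol

snr_db = 10.0
noise_var = 10 ** (-snr_db / 10)                   # total complex-noise variance N0

n = 200_000
idx = rng.integers(16, size=n)                     # equiprobable transmitted symbols
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
y = points[idx] + noise

# Channel likelihoods p(y|x) up to a common factor (it cancels in the ratios below).
lik = np.exp(-np.abs(y[:, None] - points[None, :]) ** 2 / noise_var)   # shape (n, 16)

m = 4
c_bicm = 0.0
for i in range(m):
    bi = labels[idx, i]                            # transmitted bit at label position i
    # Bit metric q_i(b, y): sum of likelihoods over symbols whose i-th bit equals b.
    qi = np.array([lik[:, labels[:, i] == b].sum(axis=1) for b in (0, 1)])  # (2, n)
    # I(B_i; Y) = E[ log2( q_i(B_i, Y) / (0.5 * (q_i(0, Y) + q_i(1, Y))) ) ]
    c_bicm += np.mean(np.log2(qi[bi, np.arange(n)] / (0.5 * qi.sum(axis=0))))

print(f"Estimated BICM capacity at {snr_db} dB SNR: {c_bicm:.3f} bit/channel use")
```

Each term of the loop reuses the summed channel likelihoods that define the bit metrics q_i(b, y), so the estimator mirrors the parallel binary-input channel model that the abstract attributes to Caire et al.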