Bounds on information combining

  • Authors:
  • I. Land; S. Huettinger; P. A. Hoeher; J. B. Huber

  • Affiliations:
  • Information and Coding Theory Laboratory, University of Kiel, Germany

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2005

Abstract

When the same data sequence is transmitted over two independent channels, or when a data sequence is transmitted twice but independently over the same channel, the independent observations can be combined at the receiver side. From an information-theoretic point of view, the overall mutual information between the data sequence and the received sequences represents a combination of the mutual information of the two channels. This concept is termed information combining. A lower bound and an upper bound on the combined information are presented, and it is proved that these bounds are tight. Furthermore, this principle is extended to the computation of extrinsic information on single code bits for a repetition code and for a single parity-check code of length three. To illustrate the concept and the bounds on information combining, two applications are considered. First, bounds on the information processing characteristic (IPC) of a parallel concatenated code are derived from its extrinsic information transfer (EXIT) chart. Second, bounds on the EXIT chart of an outer repetition code and of an outer single parity-check code in a serially concatenated coding scheme are computed.
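The bounds in the paper turn out to be attained by two familiar channel models, the binary erasure channel (BEC) and the binary symmetric channel (BSC): for the repetition code the BSC minimizes and the BEC maximizes the combined information, while for the single parity-check code the roles are reversed. The following Python sketch illustrates this with the standard closed-form combining rules for these two models; it is not code from the paper, and the function names and the bisection helper inv_h2 are our own.

    import numpy as np

    def h2(x):
        # Binary entropy function in bits; clipping avoids log(0).
        x = np.clip(x, 1e-12, 1 - 1e-12)
        return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

    def inv_h2(y, tol=1e-12):
        # Invert h2 on [0, 1/2] by bisection: find p with h2(p) = y.
        lo, hi = 0.0, 0.5
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if h2(mid) < y else (lo, mid)
        return 0.5 * (lo + hi)

    # Repetition code: two independent observations of the same bit.
    def rep_combine_bec(i1, i2):
        # BEC with erasure probability e = 1 - I; information is lost
        # only if both observations are erased.
        return 1 - (1 - i1) * (1 - i2)

    def rep_combine_bsc(i1, i2):
        # BSC with crossover p such that I = 1 - h2(p); evaluate
        # I(X; Y1, Y2) = H(Y1, Y2) - H(Y1, Y2 | X) directly.
        p1, p2 = inv_h2(1 - i1), inv_h2(1 - i2)
        agree = 0.5 * ((1 - p1) * (1 - p2) + p1 * p2)   # P(Y1 = Y2 = y)
        differ = 0.5 * (p1 * (1 - p2) + (1 - p1) * p2)  # P(Y1 = y, Y2 != y)
        h_y = -2 * agree * np.log2(agree) - 2 * differ * np.log2(differ)
        return h_y - h2(p1) - h2(p2)

    # Single parity-check code of length three: extrinsic information on
    # one code bit from noisy observations of the other two.
    def spc_extrinsic_bec(i1, i2):
        # Extrinsic information survives only if neither observation erases.
        return i1 * i2

    def spc_extrinsic_bsc(i1, i2):
        # The XOR of two BSC observations is again a BSC.
        p1, p2 = inv_h2(1 - i1), inv_h2(1 - i2)
        return 1 - h2(p1 * (1 - p2) + (1 - p1) * p2)

    i1, i2 = 0.5, 0.7
    print(rep_combine_bsc(i1, i2), rep_combine_bec(i1, i2))      # lower, upper
    print(spc_extrinsic_bec(i1, i2), spc_extrinsic_bsc(i1, i2))  # lower, upper

For any pair of binary-input symmetric channels with the same mutual informations I1 and I2, the paper's result places the true combined (or extrinsic) information between the two printed values.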