When the same data sequence is transmitted over two independent channels, or when a data sequence is transmitted twice but independently over the same channel, the independent observations can be combined at the receiver side. From an information-theoretic point of view, the overall mutual information between the data sequence and the received sequences represents a combination of the mutual information of the two channels. This concept is termed information combining. A lower bound and an upper bound on the combined information are presented, and it is proved that these bounds are tight. Furthermore, this principle is extended to the computation of extrinsic information on single code bits for a repetition code and for a single parity-check code of length three, respectively. To illustrate the concept and the bounds on information combining, two applications are considered. First, bounds on the information processing characteristic (IPC) of a parallel concatenated code are derived from its extrinsic information transfer (EXIT) chart. Second, bounds on the EXIT chart for an outer repetition code and for an outer single parity-check code of a serially concatenated coding scheme are computed.
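As a numerical sketch of the two combining rules described above, the snippet below evaluates mutual-information combining at a repetition (variable) node and at a single parity-check (check) node for the two extremal channel families that appear in this line of work, the binary erasure channel (BEC) and the binary symmetric channel (BSC). The function names (`vn_bec`, `cn_bsc`, etc.) and the bisection-based entropy inverse are illustrative choices, not notation from the paper; only the closed-form combining expressions in the comments are standard.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def h2_inv(y):
    """Inverse of h2 on [0, 0.5], found by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# --- BEC combining (closed form in the channel capacities i1, i2) ---

def vn_bec(i1, i2):
    """Repetition (variable-node) combining of two BECs: I = i1 + i2 - i1*i2."""
    return i1 + i2 - i1 * i2

def cn_bec(i1, i2):
    """Single parity-check (check-node) combining of two BECs: I = i1*i2."""
    return i1 * i2

# --- BSC combining (capacities mapped to crossover probabilities) ---

def vn_bsc(i1, i2):
    """Repetition combining of two BSC observations of the same uniform bit:
    I(X; Y1, Y2) = H(Y1, Y2) - H(Y1, Y2 | X)
                 = 1 + h2(P[Y1 = Y2]) - h2(p1) - h2(p2)."""
    p1, p2 = h2_inv(1.0 - i1), h2_inv(1.0 - i2)
    q_eq = p1 * p2 + (1.0 - p1) * (1.0 - p2)  # probability the outputs agree
    return 1.0 + h2(q_eq) - h2(p1) - h2(p2)

def cn_bsc(i1, i2):
    """Check-node combining of two BSCs: the result is again a BSC whose
    crossover probability is p1*(1 - p2) + p2*(1 - p1)."""
    p1, p2 = h2_inv(1.0 - i1), h2_inv(1.0 - i2)
    return 1.0 - h2(p1 * (1.0 - p2) + p2 * (1.0 - p1))

if __name__ == "__main__":
    i1 = i2 = 0.5
    print(f"repetition node:   {vn_bsc(i1, i2):.4f} <= I <= {vn_bec(i1, i2):.4f}")
    print(f"parity-check node: {cn_bec(i1, i2):.4f} <= I <= {cn_bsc(i1, i2):.4f}")
```

Running this for two channels of capacity 0.5 each shows the ordering stated by the bounds: at the repetition node the BEC maximizes and the BSC minimizes the combined information, while at the parity-check node the roles are reversed.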