Consider coded transmission over a binary-input symmetric memoryless channel. The channel decoder uses the noisy observations of the code symbols to reproduce the transmitted code symbols: it combines the information about the individual code symbols into an overall information about each code symbol, which may be the reproduced code symbol itself or its a posteriori probability. This tutorial addresses the problem of "information combining" from an information-theoretic point of view: the decoder combines the mutual information between the channel input symbols and the channel output symbols (the observations) into the mutual information between one transmitted symbol and all channel output symbols. The actual value of the combined information depends on the statistical structure of the channels, but it can be upper- and lower-bounded for the assumed class of channels. The tutorial first introduces the concept of mutual information profiles and revisits the well-known Jensen's inequality. Using these tools, the bounds on information combining are derived for single parity-check codes and for repetition codes. The application of the bounds is illustrated with four examples: information processing characteristics of coding schemes, including extrinsic information transfer (EXIT) functions; the design of multiple turbo codes; bounds on the decoding threshold of low-density parity-check codes; and the EXIT function of the accumulator.
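To make these bounds concrete, here is a minimal Python sketch (not part of the original text; all function names are mine) of the two-observation case, assuming the extremal-channel result the tutorial builds on: for the single parity-check code the binary erasure channel (BEC) attains the lower bound and the binary symmetric channel (BSC) the upper bound on the combined extrinsic information, while for the repetition code the roles are reversed. The last function turns the two worst cases into a pessimistic density-evolution recursion for a regular (3,6) LDPC code, illustrating the decoding-threshold application mentioned above; the bisected value is a sufficient condition, not a tight threshold.

```python
import math

def h2(p: float) -> float:
    """Binary entropy function, base-2 logarithm."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def h2_inv(y: float, tol: float = 1e-10) -> float:
    """Inverse of h2 restricted to [0, 1/2], found by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if h2(mid) < y else (lo, mid)
    return 0.5 * (lo + hi)

def rep_bsc(i1: float, i2: float) -> float:
    """Combined information for two BSC observations of the same bit
    (the worst case for a repetition code / variable node)."""
    p1, p2 = h2_inv(1.0 - i1), h2_inv(1.0 - i2)
    agree = (1.0 - p1) * (1.0 - p2) + p1 * p2        # observations coincide
    disagree = 1.0 - agree
    cond_ent = agree * h2(p1 * p2 / agree)           # posterior error given agreement
    if disagree > 0.0:
        cond_ent += disagree * h2(p1 * (1.0 - p2) / disagree)
    return 1.0 - cond_ent

def spc_bounds(i1: float, i2: float) -> tuple[float, float]:
    """Extrinsic-information bounds for a length-3 single parity-check
    code: the BEC attains the lower bound, the BSC the upper bound."""
    lower = i1 * i2                                  # BEC: both symbols must survive
    p1, p2 = h2_inv(1.0 - i1), h2_inv(1.0 - i2)
    # XOR of two BSC errors is again a BSC with crossover p1(1-p2)+p2(1-p1)
    upper = 1.0 - h2(p1 * (1.0 - p2) + p2 * (1.0 - p1))
    return lower, upper

def rep_bounds(i1: float, i2: float) -> tuple[float, float]:
    """Extrinsic-information bounds for a length-3 repetition code:
    here the roles swap, the BSC is worst and the BEC is best."""
    upper = 1.0 - (1.0 - i1) * (1.0 - i2)            # BEC: lost only if both erased
    return rep_bsc(i1, i2), upper

def converges(dv: int, dc: int, i_ch: float, iters: int = 500) -> bool:
    """Pessimistic message-passing recursion for a regular (dv, dc) LDPC
    code, with worst-case combining at both node types. If the message
    information is driven to 1, belief propagation succeeds (in the
    asymptotic, cycle-free setting of density evolution) on every
    binary-input symmetric channel with mutual information i_ch."""
    i_msg = 0.0
    for _ in range(iters):
        i_chk = i_msg ** (dc - 1)                    # check node: BEC worst case
        i_var = i_ch                                 # variable node: BSC worst case,
        for _ in range(dv - 1):                      # applied pairwise to the bound
            i_var = rep_bsc(i_var, i_chk)
        i_msg = i_var
    return i_msg > 1.0 - 1e-6

if __name__ == "__main__":
    for i1, i2 in [(0.5, 0.5), (0.3, 0.7)]:
        lo, hi = spc_bounds(i1, i2)
        print(f"SPC I1={i1} I2={i2}: {lo:.4f} <= I_ext <= {hi:.4f}")
        lo, hi = rep_bounds(i1, i2)
        print(f"REP I1={i1} I2={i2}: {lo:.4f} <= I_ext <= {hi:.4f}")
    # Bisect for the smallest channel information that provably suffices.
    lo, hi = 0.0, 1.0
    for _ in range(25):
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if converges(3, 6, mid) else (mid, hi)
    print(f"(3,6) LDPC: decoding guaranteed whenever I_ch >= {hi:.4f}")
```

Because every step of the recursion uses the worst-case combination, the bisected value can only overestimate the true threshold of any specific channel in the class, such as the BEC or the binary-input AWGN channel; replacing both worst cases by the corresponding best cases would give the matching optimistic bound.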