Typical minimum distances and error exponents are analyzed on the 8-PSK Gaussian channel for two capacity-achieving code ensembles with different algebraic structures. It is proved that the typical group code over the cyclic group of order eight achieves both the Gilbert-Varshamov bound and the expurgated error exponent. On the other hand, the typical binary-coset code (under any labeling) is shown to be bounded away both from the Gilbert-Varshamov bound (at any rate) and from the expurgated exponent (at low rates). This phenomenon is shown to stem from the symmetry structure of the 8-PSK constellation, which is known to match the cyclic group of order eight but not the direct product of three copies of the binary group. The presented results indicate that designing group codes matched to the symmetry of the channel guarantees better typical-code performance than designing codes whose algebraic structure does not match the channel. This contrasts with the well-known fact that the average binary-coset code achieves both the capacity and the random-coding error exponent of any discrete memoryless channel.
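The symmetry-matching claim can be checked numerically. The following sketch (an illustration, not code from the paper) verifies that adding a constant label modulo 8, the group operation of the cyclic group of order eight, rotates the 8-PSK constellation and therefore preserves all pairwise Euclidean distances, whereas XOR-ing a constant onto the natural 3-bit binary labels, the operation of the direct product of three binary groups, does not:

```python
import cmath
import itertools

# 8-PSK constellation: eight points equally spaced on the unit circle.
pts = [cmath.exp(2j * cmath.pi * k / 8) for k in range(8)]

def dist(a, b):
    """Euclidean distance between constellation points labeled a and b."""
    return abs(pts[a] - pts[b])

# Z_8 symmetry: adding a constant c mod 8 rotates the constellation by
# 2*pi*c/8, so every pairwise distance is preserved (an isometry).
z8_isometric = all(
    abs(dist((a + c) % 8, (b + c) % 8) - dist(a, b)) < 1e-12
    for a, b, c in itertools.product(range(8), repeat=3)
)

# (Z_2)^3 operation under the natural binary labeling: XOR-ing a constant
# label c is in general NOT an isometry of the constellation.
xor_isometric = all(
    abs(dist(a ^ c, b ^ c) - dist(a, b)) < 1e-12
    for a, b, c in itertools.product(range(8), repeat=3)
)

print(z8_isometric)   # True:  Z_8 label shifts preserve all distances
print(xor_isometric)  # False: e.g. XOR by 1 maps the pair (0, 3) to (1, 2)
```

Because codeword distance spectra under a group ensemble are invariant under the group operation, only the operation that acts by isometries (here, the cyclic one) transfers that invariance to Euclidean distances on the channel, which is the structural reason behind the contrasting typical-code behavior described above.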