General random coding theorems for lattices are derived from the Minkowski-Hlawka theorem, and their close relation to standard averaging arguments for linear codes over finite fields is pointed out. A new version of the Minkowski-Hlawka theorem itself is obtained as the limit, for p → ∞, of a simple lemma for linear codes over GF(p) used with p-level amplitude modulation. The relation between the combinatorial packing of solid bodies and the information-theoretic "soft packing" with arbitrarily small, but positive, overlap is illuminated. The "soft-packing" results are new. When specialized to the additive white Gaussian noise channel, they reduce to (a version of) the de Buda-Poltyrev result that spherically shaped lattice codes, together with a decoder that is unaware of the shaping, can achieve the rate (1/2) log₂(P/N).
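The lattice construction underlying this argument pairs a linear code over GF(p) with p-level amplitude modulation: the codewords, read as p-PAM symbols, are coset representatives of the lattice C + pZⁿ (the familiar Construction A). A minimal sketch of that correspondence is below; the parameters p, the generator matrix G, and the power/noise values P and N are illustrative choices, not values from the paper:

```python
import itertools
import math

def construction_a_codewords(G, p):
    """Enumerate the codewords of the linear code over GF(p) generated
    by the rows of G. Read as p-PAM symbol vectors, these codewords are
    the coset representatives of the Construction A lattice C + p*Z^n."""
    k = len(G)                     # number of generator rows (dimension)
    codewords = set()
    for msg in itertools.product(range(p), repeat=k):
        # codeword = (message . G) mod p, computed column by column
        cw = tuple(sum(m * g for m, g in zip(msg, col)) % p
                   for col in zip(*G))
        codewords.add(cw)
    return codewords

# Illustrative toy example: p = 5, length-2 code with one generator row.
p = 5
G = [[1, 2]]
code = construction_a_codewords(G, p)   # p^k = 5 codewords

# The AWGN benchmark rate quoted in the abstract: (1/2) log2(P/N),
# with P and N as illustrative signal-power and noise-power values.
P, N = 10.0, 1.0
rate = 0.5 * math.log2(P / N)           # bits per dimension
```

As p grows, the averaging lemma over such codes passes to the Minkowski-Hlawka limit described in the abstract; the sketch only illustrates the finite-p object the lemma averages over.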