A necessary and sufficient condition for the channel capacity is rigorously formulated for the N-user discrete memoryless multiple-access channel (MAC). The essence is to invoke an elementary MAC, in which the size of each input alphabet is not greater than the size of the output alphabet. The main objective is to demonstrate that the channel capacity of a MAC is achieved by an elementary MAC included in the original MAC; the proof is quite straightforward from the very definition of the elementary MAC. The second objective is to prove that the Kuhn-Tucker conditions of the elementary MAC are sufficient (they are obviously necessary) for the channel capacity. The latter proof requires two distinctive properties of the MAC: every solution of the Kuhn-Tucker conditions is a local maximum on the domain of all possible input probability distributions (IPDs), and, particularly for the elementary MAC, the set of IPDs for which the value of the mutual information is not smaller than an arbitrary positive number is connected on the domain. As a result, with respect to the channel capacity, a general MAC can be regarded as an aggregate of a finite number of elementary MACs.
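The capacity computation over input probability distributions can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes a hypothetical two-user binary MAC whose output is the OR of the inputs, and simply grid-searches independent (product) IPDs for the one maximizing the total mutual information I(X1, X2; Y).

```python
import itertools
import math

# Hypothetical channel law W(y | x1, x2) for illustration only:
# a two-user binary MAC with deterministic OR output.
def W(y, x1, x2):
    return 1.0 if y == (x1 | x2) else 0.0

def mutual_information(p1, p2):
    """I(X1, X2; Y) in bits for independent inputs with P(Xi = 1) = pi."""
    # Output distribution induced by the product input distribution.
    py = [0.0, 0.0]
    for x1, x2 in itertools.product((0, 1), repeat=2):
        px = (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)
        for y in (0, 1):
            py[y] += px * W(y, x1, x2)
    # I(X1, X2; Y) = sum_{x1,x2,y} p(x1,x2) W(y|x1,x2) log2(W(y|x1,x2)/p(y)).
    info = 0.0
    for x1, x2 in itertools.product((0, 1), repeat=2):
        px = (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)
        for y in (0, 1):
            w = W(y, x1, x2)
            if px > 0 and w > 0 and py[y] > 0:
                info += px * w * math.log2(w / py[y])
    return info

# Coarse grid search over product IPDs (a stand-in for solving the
# Kuhn-Tucker conditions analytically).
best = max(
    ((mutual_information(a / 100, b / 100), a / 100, b / 100)
     for a in range(101) for b in range(101)),
    key=lambda t: t[0],
)
print(best)  # maximal I(X1, X2; Y) and the maximizing (p1, p2)
```

For this OR channel the output is a deterministic function of the inputs, so I(X1, X2; Y) = H(Y), which is maximized at 1 bit whenever (1 - p1)(1 - p2) = 1/2; the grid search finds such a point. A maximizing distribution here puts mass on fewer input letters than the general search space allows, loosely echoing the paper's point that capacity is attained on an elementary sub-channel.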