We consider three capacity definitions for composite channels with channel side information at the receiver. A composite channel consists of a collection of different channels with a distribution characterizing the probability that each channel is in operation. The Shannon capacity of a channel is the highest rate asymptotically achievable with arbitrarily small error probability. Under this definition, the transmission strategy used to achieve the capacity must achieve arbitrarily small error probability for every channel in the collection comprising the composite channel, so the resulting capacity is dominated by the worst channel in the collection, no matter how unlikely that channel is. We therefore broaden the definition of capacity to allow for some outage. The capacity versus outage is the highest rate asymptotically achievable with a given probability of decoder-recognized outage. The expected capacity is the highest average rate asymptotically achievable with a single encoder and multiple decoders, where channel side information determines the channel in use. The expected capacity generalizes capacity versus outage: codes designed for capacity versus outage decode at one of two rates (rate zero when the channel is in outage and the target rate otherwise), while codes designed for expected capacity can decode at many rates. Expected capacity equals Shannon capacity for channels governed by a stationary ergodic random process but is typically greater for general channels. Both the capacity versus outage and expected capacity definitions relax the constraint that all transmitted information must be decoded at the receiver. We derive channel coding theorems for these capacity definitions using the information density and provide numerical examples to highlight their connections and differences. We also discuss the implications of these alternative capacity definitions for end-to-end distortion, source-channel coding, and separation.
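The gap between the three definitions can be illustrated numerically. The sketch below, which is not from the paper, uses a hypothetical two-state composite AWGN channel (the SNR values and state probabilities are assumptions chosen for illustration): the Shannon (compound) rate is pinned to the worst state, capacity versus outage discards low-probability bad states, and the probability-weighted average of the per-state capacities serves only as a simple upper bound on expected capacity, since a single encoder generally cannot attain it.

```python
import math

# Toy composite channel: two AWGN states, side information at the receiver.
# (Illustrative numbers only; SNRs and probabilities are assumptions.)
channels = [
    {"p": 0.9, "snr": 10.0},  # good state, probability 0.9
    {"p": 0.1, "snr": 0.1},   # bad state, probability 0.1
]

def awgn_capacity(snr):
    """Capacity of a real AWGN channel, in bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

# Shannon capacity of the composite: the code must be decodable in every
# state, so the rate is limited by the worst channel, however unlikely.
shannon = min(awgn_capacity(ch["snr"]) for ch in channels)

# Capacity versus outage at outage probability q: declare outage on the
# worst states whose total probability does not exceed q, then transmit
# at the capacity of the worst state that must still be served.
def capacity_vs_outage(channels, q):
    ordered = sorted(channels, key=lambda ch: ch["snr"])
    dropped = 0.0
    remaining = list(ordered)
    for ch in ordered:
        if dropped + ch["p"] <= q and len(remaining) > 1:
            dropped += ch["p"]
            remaining.pop(0)
        else:
            break
    return awgn_capacity(remaining[0]["snr"])

outage_rate = capacity_vs_outage(channels, q=0.1)

# Simple upper bound on expected capacity: the probability-weighted
# average of per-state capacities (achievable only if a separate code
# could be tailored to each state, which a single encoder cannot do).
expected_upper = sum(ch["p"] * awgn_capacity(ch["snr"]) for ch in channels)

print(f"Shannon (worst-case) capacity : {shannon:.3f} bits/use")
print(f"Capacity vs outage (q = 0.1)  : {outage_rate:.3f} bits/use")
print(f"Expected-capacity upper bound : {expected_upper:.3f} bits/use")
```

With these numbers the worst-case rate is near zero while allowing 10% outage recovers the full good-state rate, mirroring the abstract's point that Shannon capacity is dominated by the worst, possibly rare, channel in the collection.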