We consider the capacity of discrete-time channels with feedback for the general case where the feedback is a time-invariant deterministic function of the output samples. Under the assumption that the channel states take values in a finite alphabet, we find a sequence of achievable rates and a sequence of upper bounds on the capacity. The achievable rates and the upper bounds are computable for any N, and the limits of both sequences exist. We show that when the probability of the initial state is positive for all channel states, the capacity is the limit of the achievable-rate sequence. We further show that when the channel is stationary, indecomposable, and has no intersymbol interference (ISI), its capacity is given by the limit of the maximum of the (normalized) directed information between the input X^N and the output Y^N, i.e., C = lim_{N→∞} (1/N) max I(X^N → Y^N), where the maximization is taken over the causal conditioning probability Q(x^N ∥ z^{N-1}) defined in this paper. The main idea behind these results is to incorporate causality into Gallager's results on finite-state channels. The capacity results are used to show that the source-channel separation theorem holds under time-invariant deterministic feedback, and that if the state of the channel is known at both the encoder and the decoder, then feedback does not increase capacity.
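To make the directed-information quantity concrete, here is a minimal numerical sketch (not from the paper; the function name and the toy joint distribution are illustrative assumptions). It evaluates I(X^N → Y^N) = Σ_{i=1}^{N} I(X^i; Y_i | Y^{i-1}) for an explicit joint pmf over length-N binary sequence pairs, and checks it on a memoryless binary symmetric channel with i.i.d. uniform inputs, where the directed information per use should equal the BSC's ordinary mutual information 1 − H_2(ε):

```python
import itertools
import math

def directed_information(p, N):
    """I(X^N -> Y^N) = sum_{i=1}^N I(X^i; Y_i | Y^{i-1}) in bits.

    p: dict mapping (x_tuple, y_tuple) -> probability, both tuples of length N.
    """
    def marg(key_fn):
        # Marginalize the joint pmf onto the key produced by key_fn.
        m = {}
        for (x, y), pr in p.items():
            k = key_fn(x, y)
            m[k] = m.get(k, 0.0) + pr
        return m

    total = 0.0
    for i in range(1, N + 1):
        pxy      = marg(lambda x, y: (x[:i], y[:i]))      # p(x^i, y^i)
        pxy_prev = marg(lambda x, y: (x[:i], y[:i - 1]))  # p(x^i, y^{i-1})
        py       = marg(lambda x, y: y[:i])               # p(y^i)
        py_prev  = marg(lambda x, y: y[:i - 1])           # p(y^{i-1})
        for (xi, yi), pr in pxy.items():
            if pr == 0.0:
                continue
            num = pr / pxy_prev[(xi, yi[:-1])]  # p(y_i | x^i, y^{i-1})
            den = py[yi] / py_prev[yi[:-1]]     # p(y_i | y^{i-1})
            total += pr * math.log2(num / den)
    return total

# Toy check: BSC(eps), i.i.d. uniform inputs, no feedback.
# Without feedback, I(X^N -> Y^N) = I(X^N; Y^N) = N * (1 - H2(eps)).
N, eps = 2, 0.1
p = {}
for x in itertools.product((0, 1), repeat=N):
    for y in itertools.product((0, 1), repeat=N):
        pr = 1.0
        for xi, yi in zip(x, y):
            pr *= 0.5 * ((1 - eps) if xi == yi else eps)
        p[(x, y)] = pr

di = directed_information(p, N)
h2 = -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
print(di / N, 1 - h2)  # the two values should agree
```

For channels used with feedback the two quantities separate: I(X^N → Y^N) ≤ I(X^N; Y^N), and it is the directed information that appears in the capacity formula above.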