The method of types is one of the key technical tools in Shannon theory, and it is valuable in other fields as well. In this paper, some key applications are presented in sufficient detail to enable an interested nonspecialist to gain a working knowledge of the method, and a wide selection of further applications is surveyed. These range from hypothesis testing and large deviations theory, through error exponents for discrete memoryless channels and the capacity of arbitrarily varying channels, to multiuser problems. While the method of types is suitable primarily for discrete memoryless models, its extensions to certain models with memory are also discussed.
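To make the central concept concrete, here is a minimal sketch (in Python, not from the paper) of the two objects the method of types is built on: the type of a sequence, i.e. its empirical distribution over a finite alphabet, and the size of its type class, i.e. the number of sequences sharing that type, which is a multinomial coefficient. The function names are illustrative, not from the source.

```python
from collections import Counter
from math import factorial

def type_of(seq, alphabet):
    """Empirical distribution (the 'type') of seq over a finite alphabet."""
    n = len(seq)
    counts = Counter(seq)
    return {a: counts.get(a, 0) / n for a in alphabet}

def type_class_size(seq):
    """Number of sequences with the same type as seq:
    the multinomial coefficient n! / (n_1! * n_2! * ...)."""
    n = len(seq)
    size = factorial(n)
    for c in Counter(seq).values():
        size //= factorial(c)
    return size

# Example: the sequence 'aab' has type (2/3, 1/3) over {a, b},
# and its type class contains the 3 sequences aab, aba, baa.
print(type_of("aab", "ab"))        # {'a': 0.666..., 'b': 0.333...}
print(type_class_size("aab"))      # 3
```

The key counting fact exploited throughout the survey is that the number of distinct types grows only polynomially in the block length n, while each type class grows exponentially, with exponent given by the entropy of the type.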