This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. Also, an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory.
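The two central objects named above, information divergence and I-projection, can be illustrated concretely in the finite alphabet setting. The sketch below (an illustration assuming standard definitions, not code from the tutorial) computes D(P‖Q) and finds the I-projection of a distribution Q onto a linear family fixing the mean, where the minimizer is known to be an exponentially tilted version of Q; the tilt parameter is found by bisection, and the function names `kl_divergence` and `i_projection_mean` are hypothetical.

```python
import math

def kl_divergence(p, q):
    """D(P||Q) = sum_x p(x) log(p(x)/q(x)) over a finite alphabet.
    Convention: 0*log(0/q) = 0; D = +inf if p(x) > 0 where q(x) = 0."""
    d = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            d += pi * math.log(pi / qi)
    return d

def i_projection_mean(q, values, m, tol=1e-12):
    """I-projection of Q onto the linear family
    {P : sum_x P(x) * values[x] = m}, i.e. the P in the family
    minimizing D(P||Q). The minimizer is the exponential tilt
    P*(x) proportional to q(x) * exp(t * values[x]); the mean of the
    tilted distribution is increasing in t, so solve for t by bisection."""
    def tilted_mean(t):
        w = [qi * math.exp(t * v) for qi, v in zip(q, values)]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z
    lo, hi = -50.0, 50.0  # assumes the solution t lies in this bracket
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if tilted_mean(mid) < m:
            lo = mid  # mean too small: increase the tilt
        else:
            hi = mid
    t = (lo + hi) / 2
    w = [qi * math.exp(t * v) for qi, v in zip(q, values)]
    z = sum(w)
    return [wi / z for wi in w]
```

For example, projecting the uniform distribution on a die's faces {1,…,6} onto the family with mean 4.5 recovers the classical maximum-entropy "loaded die": `i_projection_mean([1/6]*6, [1, 2, 3, 4, 5, 6], 4.5)` returns a member of the exponential family through the uniform distribution, matching the tutorial's theme of maximum likelihood and I-projection in exponential families.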