This paper considers the problem of constructing information-theoretic universal models for data distributed according to the exponential distribution. The universal models examined include the sequential normalized maximum likelihood (SNML) code, the conditional normalized maximum likelihood (CNML) code, the minimum message length (MML) code, and the Bayes mixture code (BMC). With suitable data-driven priors, the CNML code yields a codelength identical to that of the Bayes mixture code, and within O(1) of the MML codelength.
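To illustrate the kind of equivalence the abstract refers to, the following sketch compares two codelengths for i.i.d. exponential data: the conditional NML codelength and a prequential Bayes mixture codelength. It is not taken from the paper; the function names, the Jeffreys prior 1/λ, and the choice to condition both codes on the first observation are assumptions made here for the example. Under those assumptions the two codelengths coincide exactly.

```python
import numpy as np
from scipy.special import gammaln

def cnml_codelength_bits(x):
    """CNML codelength (bits) of x[1:] given x[0] for the exponential model
    p(x | lam) = lam * exp(-lam * x).

    Illustrative derivation (not quoted from the paper): normalizing the
    maximized likelihood over all continuations of x[0] gives the closed
    form p_CNML(x[1:] | x[0]) = (n-1)! * x[0] / S**n, with S = sum(x).
    """
    x = np.asarray(x, dtype=float)
    n, S = len(x), x.sum()
    return (n * np.log(S) - gammaln(n) - np.log(x[0])) / np.log(2)

def bayes_mixture_codelength_bits(x):
    """Prequential Bayes mixture codelength (bits) of x[1:] given x[0]
    under the (assumed) Jeffreys prior pi(lam) ~ 1/lam, made proper by
    conditioning on the first observation.

    After observing x[0..i-1] with running sum S_i, the posterior is
    Gamma(i, S_i) and the one-step predictive density of the next point
    is the Lomax density  i * S_i**i / (S_i + x)**(i + 1).
    """
    x = np.asarray(x, dtype=float)
    total_bits = 0.0
    S = x[0]
    for i, xi in enumerate(x[1:], start=1):
        log_pred = np.log(i) + i * np.log(S) - (i + 1) * np.log(S + xi)
        total_bits -= log_pred / np.log(2)
        S += xi
    return total_bits

rng = np.random.default_rng(0)
data = rng.exponential(scale=3.0, size=50)
print(cnml_codelength_bits(data))
print(bayes_mixture_codelength_bits(data))  # agrees up to floating-point error
```

The agreement is not a numerical coincidence under these assumptions: the product of the sequential predictive densities telescopes to (n-1)! · x[0] / S**n, which is exactly the CNML conditional distribution, so the two printed values differ only by rounding.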