Univariate Polynomial Inference by Monte Carlo Message Length Approximation
ICML '02 Proceedings of the Nineteenth International Conference on Machine Learning
This paper derives two new information-theoretic linear regression criteria based on the minimum message length (MML) principle. Both criteria are invariant to full-rank affine transformations of the design matrix and yield estimates that are minimax with respect to squared-error loss. The new criteria are compared against state-of-the-art information-theoretic model selection criteria on both real and synthetic data, and show good performance in all cases.
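To illustrate the general shape of information-theoretic model selection for univariate polynomials, the sketch below scores candidate degrees with a simple two-part codelength. Note that this is a BIC-style stand-in for illustration only, not the paper's MML criteria; the data-generating polynomial, noise level, and candidate range are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a known quadratic (illustrative setup, not from the paper)
n = 200
x = np.linspace(-1.0, 1.0, n)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, n)

def codelength(x, y, degree):
    """Approximate two-part codelength for a degree-`degree` polynomial fit.

    Data cost: n/2 * log(RSS/n); model cost: k/2 * log(n) for the
    k = degree + 1 coefficients. This is a BIC-style surrogate, NOT the
    paper's MML-based criteria.
    """
    coeffs = np.polyfit(x, y, degree)          # least-squares fit
    rss = np.sum((y - np.polyval(coeffs, x)) ** 2)
    k = degree + 1
    return 0.5 * n * np.log(rss / n) + 0.5 * k * np.log(n)

scores = {d: codelength(x, y, d) for d in range(7)}
best = min(scores, key=scores.get)
print(best)  # the degree achieving the shortest approximate codelength
```

The model-cost term penalizes each extra coefficient, so overfitted high-degree polynomials lose to the true quadratic despite their smaller residuals; MML criteria refine this trade-off with a more careful accounting of parameter precision.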