This paper addresses the problem of estimating the density of a future outcome from a multivariate normal model. We propose a class of empirical Bayes predictive densities and evaluate their performance under the Kullback-Leibler (KL) divergence. We show that these empirical Bayes predictive densities dominate the Bayesian predictive density under the uniform prior and thus are minimax under some general conditions. We also establish the asymptotic optimality of these empirical Bayes predictive densities in infinite-dimensional parameter spaces through an oracle inequality.
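For context, a sketch of the standard evaluation framework the abstract refers to (the specific notation here is an assumption, not taken from the paper): one observes $X$ from a multivariate normal model, forms a predictive density $\hat{p}(y \mid x)$ for a future outcome $Y$ with the same mean, and scores it by KL divergence from the true density:

```latex
% Standard predictive-density setup (notation assumed for illustration):
% X ~ N_p(\mu, \sigma_x^2 I) observed, Y ~ N_p(\mu, \sigma_y^2 I) future.
% KL loss of a predictive density \hat{p}(\cdot \mid x):
L(\mu, \hat{p}(\cdot \mid x))
  = \int p(y \mid \mu) \,
    \log \frac{p(y \mid \mu)}{\hat{p}(y \mid x)} \, dy,
% and its risk averages over the observation X:
R(\mu, \hat{p}) = \mathbb{E}_{X \mid \mu}\,
  L(\mu, \hat{p}(\cdot \mid X)).
```

A predictive density dominates another when its risk $R(\mu, \hat{p})$ is no larger for every $\mu$ and strictly smaller for some $\mu$; minimaxity refers to minimizing the worst-case risk over the parameter space.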