How to Achieve Minimax Expected Kullback-Leibler Distance from an Unknown Finite Distribution
ALT '02 Proceedings of the 13th International Conference on Algorithmic Learning Theory
When a learning process depends on the sample but not on the order of its elements, the Bernoulli distribution is relevant and Bernstein polynomials enter the analysis. We derive estimates for the Bernstein-polynomial approximation of the entropy function x log x that are sharper than the bounds obtained from Voronovskaja's theorem. In this way we obtain the correct asymptotics of the Kullback-Leibler distance for an encoding problem.
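To illustrate the objects the abstract refers to, the following sketch (not from the paper; the function names and the choice of n and x are mine) evaluates the n-th Bernstein polynomial of f(x) = x log x and compares the approximation error with the first-order asymptotic given by Voronovskaja's theorem, B_n(f, x) - f(x) ~ x(1-x) f''(x) / (2n), which for f(x) = x log x (so f''(x) = 1/x) reduces to (1-x)/(2n):

```python
import math

def bernstein(f, n, x):
    """Evaluate the n-th Bernstein polynomial of f at x in [0, 1]."""
    return sum(math.comb(n, k) * x**k * (1 - x)**(n - k) * f(k / n)
               for k in range(n + 1))

def entropy(x):
    """The entropy-type function x log x, with the convention 0 log 0 = 0."""
    return x * math.log(x) if x > 0 else 0.0

n, x = 200, 0.3
approx_error = bernstein(entropy, n, x) - entropy(x)

# Voronovskaja's asymptotic: x(1 - x) * f''(x) / (2n) = (1 - x) / (2n)
# since f''(x) = 1/x for f(x) = x log x.
voronovskaja = (1 - x) / (2 * n)
print(approx_error, voronovskaja)
```

Since x log x is convex, the Bernstein polynomial lies above the function, so the error is positive; for moderate n the Voronovskaja term already matches it closely, and the paper's contribution is a sharper estimate of the remaining gap.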