Improved lower bounds for learning from noisy examples: an information-theoretic approach
COLT' 98 Proceedings of the eleventh annual conference on Computational learning theory
The article focuses on lower bound results on expected redundancy for universal coding of independent and identically distributed data on [0, 1] from parametric and nonparametric families. After reviewing existing lower bounds, we provide a new proof of minimax lower bounds on expected redundancy over nonparametric density classes. This new proof rests on the calculation of a mutual information quantity; equivalently, it exploits the relationship between redundancy and Shannon capacity. It thereby unifies the minimax redundancy lower bound proofs in the parametric and nonparametric cases.
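The redundancy–capacity relationship invoked above is commonly stated as follows (this is the standard formulation of the redundancy–capacity theorem, given here as a sketch; the paper's exact notation may differ). For a family of sources $\{P_\theta^n : \theta \in \Theta\}$ on $n$ samples, the minimax expected redundancy of a coding distribution $Q$ equals the maximin (Bayes) redundancy, which is the capacity of the "channel" from the parameter to the data:

$$
R_n^+ \;=\; \inf_{Q}\,\sup_{\theta \in \Theta} D\!\left(P_\theta^n \,\middle\|\, Q\right)
\;=\; \sup_{w}\, I(\theta;\, X^n)
\;=\; C_n,
$$

where the supremum on the right is over priors $w$ on $\Theta$, $I(\theta; X^n)$ is the mutual information between a parameter drawn from $w$ and the resulting data, and $D(\cdot\|\cdot)$ is the Kullback–Leibler divergence. A lower bound on the mutual information for any particular prior therefore immediately lower-bounds the minimax redundancy, which is the mechanism the abstract's unified proof uses in both the parametric and nonparametric settings.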