Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM (JACM).
Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension. COLT '91: Proceedings of the Fourth Annual Workshop on Computational Learning Theory.
A Learning Criterion for Stochastic Rules. Machine Learning.
On the Computational Complexity of Approximating Distributions by Probabilistic Automata. Machine Learning.
Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation.
Learning non-parametric smooth rules by stochastic rules with finite partitioning. Euro-COLT '93: Proceedings of the First European Conference on Computational Learning Theory.
Stochastic Complexity in Statistical Inquiry.
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics).
This paper proposes the minimum L-complexity algorithm (MLC), which can be thought of as an extension of the minimum description length (MDL) principle-based algorithm to the case where general real-valued functions are used as hypotheses and general loss functions are used as distortion measures. MLC is also closely related to Barron's complexity regularization algorithm and Vapnik's structural risk minimization. We demonstrate the effectiveness of MLC in terms of sample complexity within the decision-theoretic PAC learning model. Specifically, using MLC, we develop a unifying method for deriving upper bounds on target-dependent (non-uniform) sample complexity in both parametric and non-parametric settings. We further introduce a method for evaluating average-case sample complexity, where the average is taken with respect to a prior probability over the parametric target class. These target-dependent and average-case bounds offer a new view of sample complexity analysis, whereas most previous work has focused on worst-case sample complexity. As applications of MLC, we consider the problems of learning non-parametric rules using 1) stochastic rules with finite partitioning, 2) finite Hermite series, and 3) finite Fourier series. We use MLC to improve the previously known best sample complexity results for these problems.
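To make the criterion concrete, the following display sketches the MDL-style selection rule that MLC generalizes: choose the hypothesis minimizing empirical loss plus a complexity penalty. The notation here (sample size $m$, hypothesis class $\mathcal{F}$, loss function $L$, description length $\ell$, trade-off weight $\lambda$) is introduced purely for illustration and is not taken verbatim from the paper.

\[
  \hat{f} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{F}}
  \left\{ \sum_{i=1}^{m} L\bigl(y_i, f(x_i)\bigr) \;+\; \lambda\,\ell(f) \right\}
\]

With the logarithmic loss $L(y, f(x)) = -\log f(y \mid x)$ and $\lambda = 1$, this reduces to the familiar two-part MDL criterion; permitting general real-valued hypotheses $f$ and general losses $L$ is the extension the abstract attributes to MLC.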