PAC learning with generalized samples and an application to stochastic geometry
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Learning stochastic functions by smooth simultaneous estimation
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
On learning noisy threshold functions with finite precision weights
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Characterizations of learnability for classes of {0, …, n}-valued functions
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
On learning in the limit and non-uniform (ε,δ)-learning
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
Acceleration of learning in binary choice problems
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
Learning from a mixture of labeled and unlabeled examples with parametric side information
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
Analysis of greedy expert hiring and an application to memory-based learning (extended abstract)
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
Estimation of time-varying parameters in statistical models: an optimization approach
COLT '97 Proceedings of the tenth annual conference on Computational learning theory
Efficient learning of monotone concepts via quadratic optimization
COLT '98 Proceedings of the eleventh annual conference on Computational learning theory
Learning curves for error minimum and maximum likelihood algorithms
Neural Computation
We describe a generalization of the PAC learning model that is based on statistical decision theory. In this model the learner receives randomly drawn examples, each example consisting of an instance x in X and an outcome y in Y , and tries to find a hypothesis h : X --
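The setup in the abstract, where a learner draws random (x, y) examples and searches for a good hypothesis, can be illustrated with a minimal empirical-risk-minimization sketch. The threshold hypothesis class, the 0-1 loss, and the data-generating rule below are illustrative assumptions for this sketch, not details taken from the paper:

```python
import random

def erm(sample, hypotheses, loss):
    """Return the hypothesis with the smallest empirical loss on the sample."""
    def empirical_loss(h):
        return sum(loss(h(x), y) for x, y in sample) / len(sample)
    return min(hypotheses, key=empirical_loss)

# Toy instance space X = [0, 1), outcome space Y = {0, 1}; the outcomes
# follow a threshold rule at 0.5 (an assumption of this sketch).
random.seed(0)
sample = [(x, int(x >= 0.5)) for x in (random.random() for _ in range(200))]

# Small finite hypothesis class of threshold functions h_t(x) = [x >= t].
hypotheses = [lambda x, t=t / 10: int(x >= t) for t in range(11)]
zero_one = lambda prediction, y: int(prediction != y)

best = erm(sample, hypotheses, zero_one)
print(best(0.7), best(0.3))
```

With enough samples, the empirical minimizer agrees with the true threshold rule on points well away from the boundary, which is the kind of guarantee a PAC-style analysis makes precise.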