On probably correct classification of concepts
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
In this paper, we introduce an extension of the standard PAC learning model that allows the use of generalized samples. We view a generalized sample as a pair consisting of a functional on the concept class together with the value obtained by applying that functional to the unknown concept. This model can be applied to a number of problems in signal processing and geometric reconstruction to provide sample size bounds under a PAC criterion. We consider a specific application of the model to a problem of curve reconstruction, and discuss some connections with a result from stochastic geometry.
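The abstract's notion of a generalized sample — a functional on the concept class paired with its value on the unknown concept — can be made concrete with a small sketch. The example below is purely illustrative and is not drawn from the paper: it represents a concept as a finite planar point set, takes the functional to be the support function evaluated at an angle (a classical geometric-reconstruction measurement), and checks a hypothesis for consistency with the observed generalized samples. All names (`support_functional`, `consistent`) are hypothetical.

```python
import math

# Hypothetical sketch: a "generalized sample" of an unknown concept C is a
# pair (functional, functional(C)). Here the functional is the support
# function of a point set evaluated in direction theta:
#   h_C(theta) = max_{(x, y) in C} (x*cos(theta) + y*sin(theta))

def support_functional(theta):
    """Return a functional mapping a concept (finite point set in the
    plane) to its support value in direction theta."""
    def f(concept):
        return max(x * math.cos(theta) + y * math.sin(theta)
                   for x, y in concept)
    return f

# Unknown concept: vertices of the unit square.
unknown = [(0, 0), (1, 0), (1, 1), (0, 1)]

# Draw generalized samples: (theta, value of the functional on the
# unknown concept), at eight equally spaced directions.
thetas = [2 * math.pi * k / 8 for k in range(8)]
samples = [(t, support_functional(t)(unknown)) for t in thetas]

def consistent(hypothesis, samples, tol=1e-9):
    """A hypothesis is consistent if it agrees with every generalized
    sample; a PAC-style learner would seek such a hypothesis."""
    return all(abs(support_functional(t)(hypothesis) - v) <= tol
               for t, v in samples)

print(consistent(unknown, samples))  # the true concept is consistent
```

In this sketch a standard PAC sample corresponds to the special case where the functional is point evaluation of the concept's indicator function; the support-line measurement shows how the same consistency framework accommodates richer observations.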