An extension of the standard probably approximately correct (PAC) learning model that allows the use of generalized samples is introduced. A generalized sample is viewed as a pair consisting of a functional on the concept class together with the value obtained by applying that functional to the unknown concept. This model can be applied to a number of problems in signal processing and geometric reconstruction to provide sample size bounds under a PAC criterion. A specific application of the generalized model to a problem of curve reconstruction is considered, and some connections with a result from stochastic geometry are discussed.
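The sample size bounds mentioned in the abstract are of the classical PAC flavor, in which the number of samples needed depends on the VC dimension of the concept class and the accuracy and confidence parameters. As a minimal sketch of such a bound (using the well-known Blumer-Ehrenfeucht-Haussler-Warmuth form, not the generalized-sample bound derived in the paper itself):

```python
import math

def pac_sample_size(vc_dim, epsilon, delta):
    """Classical PAC sample-size bound (Blumer et al. form):
    m >= max( (4/eps) * log2(2/delta), (8*d/eps) * log2(13/eps) )
    guarantees that any consistent hypothesis has error at most
    epsilon with probability at least 1 - delta.
    """
    term_conf = (4.0 / epsilon) * math.log2(2.0 / delta)
    term_dim = (8.0 * vc_dim / epsilon) * math.log2(13.0 / epsilon)
    return math.ceil(max(term_conf, term_dim))

# Example: a class of VC dimension 1, accuracy 0.1, confidence 0.95.
m = pac_sample_size(vc_dim=1, epsilon=0.1, delta=0.05)
```

Note that this sketch covers only the standard sample model; the point of the generalized-sample extension is that each "sample" is the value of a functional (for example, a support line measurement of a convex set) rather than a labeled point, while the same style of uniform-convergence bound still applies.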