Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
Machine learning: a theoretical approach
Learnability with respect to fixed distributions
Theoretical Computer Science
Computational learning theory: an introduction
The nature of statistical learning theory
A Theory of Learning and Generalization: With Applications to Neural Networks and Control Systems
Learning from Data: Concepts, Theory, and Methods
Some Improved Sample Complexity Bounds in the Probabilistic PAC Learning Model
ALT '92 Proceedings of the Third Workshop on Algorithmic Learning Theory
PAC Learning under Helpful Distributions
ALT '97 Proceedings of the 8th International Conference on Algorithmic Learning Theory
PAC Learning Using Nadaraya-Watson Estimator Based on Orthonormal Systems
ALT '97 Proceedings of the 8th International Conference on Algorithmic Learning Theory
On the Relationship between Generalization Error, Hypothesis Complexity, and Sample Complexity for Radial Basis Functions
A Theory of Networks for Approximation and Learning
Evolution of functional link networks
IEEE Transactions on Evolutionary Computation
Generalization and PAC learning: some new results for the class of generalized single-layer networks
IEEE Transactions on Neural Networks
We derive new sample complexity bounds for real-function learning tasks under the uniform distribution. These bounds are tighter than the distribution-free bounds reported elsewhere in the literature and apply to simple functional link networks and to radial basis function neural networks.
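For context on what a distribution-free baseline looks like, the sketch below computes the classic sufficient sample size from Blumer, Ehrenfeucht, Haussler, and Warmuth ("Learnability and the Vapnik-Chervonenkis dimension," cited above), m >= max((4/eps) log2(2/delta), (8d/eps) log2(13/eps)) for a concept class of VC dimension d. This is the standard distribution-free bound, not the improved uniform-distribution bounds of the abstract, which are not reproduced here; the function name and parameter choices are illustrative.

```python
import math

def blumer_sample_bound(vc_dim: int, eps: float, delta: float) -> int:
    """Distribution-free sufficient sample size for (eps, delta)-PAC
    learning a class of VC dimension vc_dim (Blumer et al., JACM 1989).
    """
    if not (0 < eps < 1 and 0 < delta < 1 and vc_dim >= 1):
        raise ValueError("require 0 < eps, delta < 1 and vc_dim >= 1")
    term_conf = (4.0 / eps) * math.log2(2.0 / delta)      # confidence term
    term_dim = (8.0 * vc_dim / eps) * math.log2(13.0 / eps)  # capacity term
    return math.ceil(max(term_conf, term_dim))

# Example: the bound grows with VC dimension and shrinking accuracy eps.
m = blumer_sample_bound(vc_dim=3, eps=0.1, delta=0.05)
```

Distribution-specific analyses, such as those in the abstract, can beat this bound precisely because they exploit knowledge of the input distribution (here, the uniform one) that a distribution-free guarantee must forgo.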