Higher-order neurons with k monomials in n variables are shown to have Vapnik-Chervonenkis (VC) dimension at least nk + 1. This result supersedes the previously known lower bound, obtained via k-term monotone disjunctive normal form (DNF) formulas, and implies that the VC dimension of higher-order neurons with k monomials is strictly larger than that of k-term monotone DNF. The bound is established by a new exponential approach that employs Gaussian radial basis function neural networks to obtain classifications of points in terms of higher-order neurons.
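To make the model concrete, the following minimal sketch (an illustration under common conventions, not code from the paper) evaluates a higher-order neuron: a threshold unit applied to a weighted sum of k monomials, where each monomial is a product of a subset of the n input variables. The function name and parameter layout are illustrative choices.

```python
from math import prod

def higher_order_neuron(x, monomials, weights, threshold=0.0):
    """Illustrative higher-order neuron with k monomials in n variables.

    x         : sequence of n input values
    monomials : list of k index tuples; each tuple names the variables
                multiplied together in one monomial
    weights   : k weights, one per monomial
    Returns 1 if the weighted sum of monomials reaches the threshold, else 0.
    """
    s = sum(w * prod(x[i] for i in idx)
            for w, idx in zip(weights, monomials))
    return 1 if s - threshold >= 0 else 0

# k = 2 monomials in n = 3 variables: 1.0*x0*x1 + 3.0*x2
x = [1.0, 2.0, -0.5]
label = higher_order_neuron(x, monomials=[(0, 1), (2,)], weights=[1.0, 3.0])
# weighted sum = 1.0*2.0 + 3.0*(-0.5) = 0.5, so the neuron outputs 1
```

With k = n monomials the neuron has nk + 1 = n^2 + 1 parameters' worth of shattering power under the stated bound, which is the sense in which its VC dimension strictly exceeds that of k-term monotone DNF.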