A neural network is said to be nonoverlapping if there is at most one edge outgoing from each node. We investigate the number of examples that a learning algorithm needs when using nonoverlapping neural networks as hypotheses. We derive bounds for this sample complexity in terms of the Vapnik-Chervonenkis dimension. In particular, we consider networks consisting of threshold, sigmoidal, and linear gates. We show that the class of nonoverlapping threshold networks and the class of nonoverlapping sigmoidal networks on n inputs both have Vapnik-Chervonenkis dimension Ω(n log n). This bound is asymptotically tight for the class of nonoverlapping threshold networks. We also present an upper bound for this class where the constants involved are considerably smaller than in a previous calculation. Finally, we argue that the Vapnik-Chervonenkis dimension of nonoverlapping threshold or sigmoidal networks cannot become larger by allowing the nodes to compute linear functions. This sheds some light on a recent result that exhibited neural networks with quadratic Vapnik-Chervonenkis dimension.
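To make these quantities concrete, here is a minimal Python sketch, not taken from the paper: the ThresholdGate class, the toy network, and the leading constant hidden in the Ω(n log n) bound are all assumptions made for illustration. It evaluates a small nonoverlapping threshold network, whose "at most one outgoing edge" condition forces the architecture to be a tree, and plugs a VC-dimension estimate into the standard PAC sample-size bound of Blumer et al. (1989).

```python
import math

# Minimal sketch (hypothetical, not from the paper): in a nonoverlapping
# network every node has at most one outgoing edge, so the architecture is
# a tree with the n input variables at its leaves.

class ThresholdGate:
    """Threshold gate: outputs 1 if sum_i w_i * v_i >= theta, else 0.
    Each child is either an input index (int) or another ThresholdGate;
    since a gate stores its own children, no node feeds two parents."""

    def __init__(self, children, weights, theta):
        self.children = children
        self.weights = weights
        self.theta = theta

    def eval(self, x):
        total = 0.0
        for child, w in zip(self.children, self.weights):
            value = x[child] if isinstance(child, int) else child.eval(x)
            total += w * value
        return 1 if total >= self.theta else 0


def pac_sample_size(d, eps, delta):
    """Blumer et al. (1989) bound: for a hypothesis class of VC dimension d,
    any consistent learner is (eps, delta)-PAC with this many examples."""
    return math.ceil(max((4 / eps) * math.log2(2 / delta),
                         (8 * d / eps) * math.log2(13 / eps)))


# A depth-2 nonoverlapping network on 4 inputs: two AND-like gates feeding
# an OR-like output gate. No input or gate feeds more than one parent.
g1 = ThresholdGate(children=[0, 1], weights=[1, 1], theta=2)     # x0 AND x1
g2 = ThresholdGate(children=[2, 3], weights=[1, 1], theta=2)     # x2 AND x3
out = ThresholdGate(children=[g1, g2], weights=[1, 1], theta=1)  # g1 OR g2

print(out.eval([1, 1, 0, 0]))  # -> 1

# Plugging a Theta(n log n) VC-dimension estimate into the sample-size
# formula; the leading constant is unknown here, so 1 is assumed.
n = 64
d = round(n * math.log2(n))
print(pac_sample_size(d, eps=0.1, delta=0.05))
```

Intuitively, the tree structure is what the analysis exploits: each input can influence the output along only a single path, in contrast to the unrestricted architectures behind the quadratic VC-dimension construction mentioned above.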