On the Sample Complexity for Nonoverlapping Neural Networks

  • Authors:
  • Michael Schmitt

  • Affiliations:
  • Ruhr-Universität Bochum, Fakultät für Mathematik, Lehrstuhl Mathematik und Informatik, D-44780 Bochum, Germany. mschmitt@lmi.ruhr-uni-bochum.de

  • Venue:
  • Machine Learning
  • Year:
  • 1999

Abstract

A neural network is said to be nonoverlapping if there is at most one edge outgoing from each node. We investigate the number of examples that a learning algorithm needs when using nonoverlapping neural networks as hypotheses. We derive bounds for this sample complexity in terms of the Vapnik-Chervonenkis dimension. In particular, we consider networks consisting of threshold, sigmoidal and linear gates. We show that the class of nonoverlapping threshold networks and the class of nonoverlapping sigmoidal networks on n inputs both have Vapnik-Chervonenkis dimension Ω(n log n). This bound is asymptotically tight for the class of nonoverlapping threshold networks. We also present an upper bound for this class where the constants involved are considerably smaller than in a previous calculation. Finally, we argue that the Vapnik-Chervonenkis dimension of nonoverlapping threshold or sigmoidal networks cannot become larger by allowing the nodes to compute linear functions. This sheds some light on a recent result that exhibited neural networks with quadratic Vapnik-Chervonenkis dimension.
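The defining property, at most one outgoing edge per node, means the network's underlying graph is a tree (each gate's output feeds a single successor). This is easy to check mechanically; the sketch below uses a hypothetical edge-list representation not taken from the paper, purely to illustrate the definition.

```python
# Illustration only (not code from the paper): a network is nonoverlapping
# when every node has out-degree at most one, so the graph forms a tree.
from collections import Counter

def is_nonoverlapping(edges):
    """Return True if every node has at most one outgoing edge.

    `edges` is a hypothetical representation: a list of
    (source_node, target_node) pairs.
    """
    out_degree = Counter(src for src, _ in edges)
    return all(d <= 1 for d in out_degree.values())

# A tree-shaped (nonoverlapping) network: two hidden gates feed one output.
tree = [("x1", "h1"), ("x2", "h1"),
        ("x3", "h2"), ("x4", "h2"),
        ("h1", "out"), ("h2", "out")]

# Letting input x1 also feed h2 gives x1 out-degree 2, so the
# network is no longer nonoverlapping.
shared = tree + [("x1", "h2")]
```

Here `is_nonoverlapping(tree)` is `True` and `is_nonoverlapping(shared)` is `False`; the restriction is what makes such networks read-once and underlies the Θ(n log n) VC dimension for the threshold case.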