A Note on Sample Complexity of Learning Binary Output Neural Networks under Fixed Input Distributions

  • Authors:
  • Vladimir Pestov

  • Venue:
  • SBRN '10 Proceedings of the 2010 Eleventh Brazilian Symposium on Neural Networks
  • Year:
  • 2010

Abstract

We show that the sample complexity of learning the sigmoidal neural network constructed by Sontag (1992) to within a given misclassification error under a fixed, purely atomic input distribution can grow arbitrarily fast: for every prescribed rate of growth there is an input distribution whose sample complexity attains this rate, and the bound is asymptotically tight. The rate can be superexponential, a non-recursive function, etc. We further observe that Sontag's ANN is not Glivenko–Cantelli under any input distribution having a non-atomic part.
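For context, the distribution-specific sample complexity referred to in the abstract can be written, under a standard PAC-style definition (the notation below is our assumption for illustration, not taken from the paper):

```latex
% Distribution-specific sample complexity of a concept class C under a fixed
% input distribution \mu (standard PAC-style formulation; notation is ours).
% L is a learning rule mapping a labelled sample of size n to a hypothesis,
% and \triangle denotes symmetric difference, so \mu(L(\sigma)\,\triangle\,c)
% is the misclassification error of the learned hypothesis w.r.t. target c.
s_\mu(\varepsilon,\delta)
  \;=\; \min\Bigl\{\, n \in \mathbb{N} \;:\;
    \exists\, L \;\; \forall\, c \in C \quad
    \Pr_{\sigma \sim \mu^{\otimes n}}
      \bigl[\, \mu\bigl(L(\sigma)\,\triangle\, c\bigr) > \varepsilon \,\bigr]
    \;<\; \delta \,\Bigr\}
```

The paper's result, in these terms, is that for the class realized by Sontag's network, $s_\mu$ as a function of $1/\varepsilon$ can be made to grow faster than any prescribed function by a suitable choice of the purely atomic distribution $\mu$.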