Bounds on the number of samples needed for neural learning

  • Authors:
  • K. G. Mehrotra; C. K. Mohan; S. Ranka

  • Affiliations:
  • Sch. of Comput. & Inf. Sci., Syracuse Univ., NY

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1991

Abstract

The relationship between the number of hidden nodes in a neural network, the complexity of a multiclass discrimination problem, and the number of samples needed for effective learning is discussed. Bounds on the number of samples needed for effective learning are given. It is shown that Ω(min(d, n)·M) boundary samples are required for successful classification of M clusters of samples using a two-hidden-layer neural network with d-dimensional inputs and n nodes in the first hidden layer.
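
The abstract's bound can be read as a quick order-of-magnitude estimate. Below is a minimal, illustrative Python sketch that evaluates min(d, n)·M for given problem sizes; the constant factor c and the example values are assumptions for illustration only and are not taken from the paper.

```python
# Illustrative only: the paper states an asymptotic lower bound of
# Omega(min(d, n) * M) boundary samples for classifying M clusters with a
# two-hidden-layer network (d-dimensional inputs, n first-hidden-layer nodes).
# The constant factor c below is a placeholder assumption, not from the paper.

def boundary_sample_lower_bound(d: int, n: int, M: int, c: float = 1.0) -> float:
    """Order-of-magnitude estimate c * min(d, n) * M of required boundary samples."""
    return c * min(d, n) * M

if __name__ == "__main__":
    # Hypothetical example: 10-dimensional inputs, 5 first-hidden-layer nodes, 8 clusters.
    print(boundary_sample_lower_bound(d=10, n=5, M=8))  # -> 40.0
```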