An upper bound on pattern storage is stated for nonlinear feedforward networks with analytic activation functions, such as the multilayer perceptron and the radial basis function network. The bound is given in terms of the number of network weights and applies to networks with any number of output nodes and arbitrary connectivity. Starting from the strict interpolation equations and from exact finite-degree polynomial models of the hidden units, a straightforward proof by contradiction establishes the upper bound. Several networks, trained by conjugate gradient, demonstrate the tightness of the bound on random patterns.
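The strict interpolation setting mentioned above can be illustrated with a minimal numpy sketch (my own, not from the paper): an RBF network with one Gaussian hidden unit centered on each input generically yields a nonsingular N-by-N interpolation system, so N random patterns are stored exactly with a weight count on the order of N. The width parameter `beta` and the problem sizes are assumptions for the demonstration.

```python
import numpy as np

# Hedged sketch of strict RBF interpolation: store N random patterns
# exactly by placing one Gaussian center on each training input.
rng = np.random.default_rng(0)
N, d = 40, 5                        # number of patterns, input dimension (assumed)
X = rng.standard_normal((N, d))     # random input patterns
t = rng.standard_normal(N)          # random target outputs
beta = 1.0                          # Gaussian width parameter (assumed)

# Interpolation matrix: Phi[i, j] = exp(-beta * ||x_i - c_j||^2),
# with centers c_j = x_j (strict interpolation).
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
Phi = np.exp(-beta * D2)

# Solve Phi w = t for the output weights; for distinct centers the
# Gaussian interpolation matrix is positive definite, so this succeeds.
w = np.linalg.solve(Phi, t)

# Maximum training error at the stored patterns (should be ~machine epsilon).
err = np.max(np.abs(Phi @ w - t))
print(err < 1e-8)
```

All N patterns are reproduced exactly, consistent with a storage capacity that scales with the number of network weights rather than exceeding it.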