Geometrical synthesis of MLP neural networks
Neurocomputing
A new derivation is presented for the bounds on the size of a multilayer neural network required to exactly implement an arbitrary training set; namely, a training set of p patterns can be implemented with zero error by a two-layer network whose number of hidden-layer neurons N1 satisfies N1 ≥ p - 1. The derivation does not require the separation of the input space by particular hyperplanes, as in previous derivations. The weights for the hidden layer can be chosen almost arbitrarily, and the weights for the output layer can then be found by solving N1 + 1 linear equations. The method exactly solves (M), the multilayer neural network training problem, for any arbitrary training set.
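The construction described above can be sketched numerically. The following is a minimal illustration, not the paper's own code: it assumes a tanh hidden layer and a 1-D toy training set (neither is specified here), draws the hidden-layer weights at random ("almost arbitrarily"), and then solves the N1 + 1 linear equations for the output-layer weights and bias, yielding zero training error with probability one.

```python
import numpy as np

rng = np.random.default_rng(0)

# p training pairs (x_i, t_i); a 1-D toy set for illustration only
p = 6
X = rng.uniform(-1.0, 1.0, size=(p, 1))
t = np.sin(3.0 * X[:, 0])

# Hidden layer: N1 = p - 1 neurons with (almost) arbitrarily chosen weights
N1 = p - 1
W = rng.normal(size=(1, N1))
b = rng.normal(size=N1)
H = np.tanh(X @ W + b)                # p x N1 matrix of hidden activations

# Output layer: N1 + 1 unknowns (N1 weights plus one bias), p equations.
# With N1 = p - 1 the system matrix is square and generically nonsingular.
A = np.hstack([H, np.ones((p, 1))])   # p x (N1 + 1)
v = np.linalg.solve(A, t)             # exact output-layer weights and bias

y = A @ v                             # network outputs on the training set
print(np.max(np.abs(y - t)))          # zero training error up to round-off
```

For random hidden weights the activation matrix is nonsingular almost surely, so the linear solve succeeds; only degenerate weight choices (e.g. duplicated hidden neurons) would require redrawing.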