The roots of backpropagation: from ordered derivatives to neural networks and political forecasting
Neural network design
IEEE Transactions on Neural Networks
The generalized multilayer perceptron (gMLP) extends the multilayer perceptron (MLP) architecture to a fully connected feedforward architecture in which connections are not restricted to adjacent layers. In this paper, the performance of MLP and gMLP networks trained with the Levenberg-Marquardt method is compared on a number of function approximation tasks, and the effect of varying the number of hidden-layer neurons is also evaluated.
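As a rough illustration of the architecture described above, the following is a minimal sketch of a gMLP-style forward pass in which each layer receives connections from the input and from every preceding layer, not only the adjacent one. The function name, layer sizes, tanh hidden units, and linear output are illustrative assumptions, not details taken from the paper, and training (e.g. via Levenberg-Marquardt) is not shown.

```python
import numpy as np

def gmlp_forward(x, weights, biases):
    """Forward pass of a generalized MLP: each layer sees the outputs of
    the input and of all preceding layers (connections skip layers)."""
    activations = [x]                      # outputs accumulated layer by layer
    for k, (W, b) in enumerate(zip(weights, biases)):
        z = np.concatenate(activations)    # feed all earlier outputs into layer k
        a = W @ z + b
        if k < len(weights) - 1:           # tanh hidden units, linear output (assumption)
            a = np.tanh(a)
        activations.append(a)
    return activations[-1]

# Illustrative sizes: 2 inputs, two hidden layers of 3 units, 1 output.
rng = np.random.default_rng(0)
sizes = [2, 3, 3, 1]
weights, biases = [], []
fan_in = 0
for prev, cur in zip(sizes[:-1], sizes[1:]):
    fan_in += prev                         # layer k's inputs: all previous layers' outputs
    weights.append(rng.standard_normal((cur, fan_in)) * 0.1)
    biases.append(np.zeros(cur))

print(gmlp_forward(np.array([0.5, -1.0]), weights, biases))
```

Setting the cross-layer weights to zero recovers an ordinary MLP, which is why the gMLP is described as a generalization of the standard architecture.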