The roots of backpropagation: from ordered derivatives to neural networks and political forecasting
Neural network design
IEEE Transactions on Neural Networks
The generalized multilayer perceptron (gMLP) augments the multilayer perceptron (MLP) architecture with all possible non-recurrent connections. The layered arbitrarily connected network (LACN) adds connections from input nodes directly to output nodes on top of the standard MLP connections. In this paper, the performance of MLP, LACN, and gMLP networks trained with the Levenberg-Marquardt method is compared on a number of function approximation tasks. The effects of varying the number of hidden-layer neurons, the error termination condition, and the training set size are also evaluated. The results presented here are preliminary; in particular, additional testing on real-world benchmark data sets is needed.
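To make the architectural difference concrete, the following sketch shows a forward pass for a one-hidden-layer network with direct input-to-output connections, the extra connections that distinguish the LACN from a plain MLP. This is an illustrative assumption of the architecture described above, not code from the paper; all dimensions and weight initializations are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for a small function-approximation task.
n_in, n_hidden, n_out = 2, 8, 1

# Standard MLP weights: input -> hidden and hidden -> output.
W_ih = rng.normal(scale=0.5, size=(n_hidden, n_in))
b_h = np.zeros(n_hidden)
W_ho = rng.normal(scale=0.5, size=(n_out, n_hidden))
b_o = np.zeros(n_out)

# Extra LACN connections: direct input -> output weights.
W_io = rng.normal(scale=0.5, size=(n_out, n_in))

def lacn_forward(x):
    """Forward pass: MLP path plus the direct input->output skip path."""
    h = np.tanh(W_ih @ x + b_h)        # hidden layer (shared with the MLP)
    return W_ho @ h + b_o + W_io @ x   # output sums both paths

x = np.array([0.3, -0.7])
y = lacn_forward(x)
```

Setting `W_io` to zero recovers the plain MLP; a gMLP would additionally connect every earlier node to every later node. In practice all of these weights would be fitted jointly, e.g. by the Levenberg-Marquardt method the paper uses.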