Performance of generalized multi-layered perceptrons and layered arbitrarily connected networks trained using the Levenberg-Marquardt method

  • Authors:
  • Steele A. Russell; Anthony S. Maida

  • Affiliations:
  • Department of Computer Science and Industrial Technology at Southeastern Louisiana University, Hammond, Louisiana, and Center For Advanced Computer Studies at the University of Louisiana at Lafayette, Lafayette, Louisiana; Center For Advanced Computer Studies at the University of Louisiana at Lafayette, Lafayette, Louisiana

  • Venue:
  • IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks
  • Year:
  • 2009


Abstract

The generalized multilayer perceptron (gMLP) augments the multilayer perceptron (MLP) architecture with all possible non-recurrent connections. The layered arbitrarily connected network (LACN) adds direct connections from input nodes to output nodes on top of the standard MLP connections. In this paper the performance of MLP, LACN, and gMLP networks trained with the Levenberg-Marquardt method is compared on a number of function approximation tasks. The effects of varying the number of hidden-layer neurons, the error termination condition, and the training set size are also evaluated. The results presented here are preliminary findings; in particular, additional testing on benchmark real-world data sets is needed.
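The difference between the three connectivity schemes can be sketched as boolean connection masks over topologically ordered nodes. The following is a minimal illustration in NumPy; the function name, layer layout, and scheme labels are assumptions made for this sketch, not taken from the paper:

```python
import numpy as np

def connection_mask(sizes, scheme):
    """Boolean mask over node pairs (i -> j) for a feed-forward net.

    Nodes are numbered in topological order: inputs first, then each
    hidden layer, then outputs.  `scheme` is one of:
      'mlp'  - connections between adjacent layers only,
      'lacn' - MLP connections plus direct input-to-output links,
      'gmlp' - every strictly forward (non-recurrent) connection.
    """
    # Layer index of each node, e.g. (3, 4, 2) -> [0,0,0,1,1,1,1,2,2]
    layer = np.repeat(np.arange(len(sizes)), sizes)
    li, lj = np.meshgrid(layer, layer, indexing="ij")
    if scheme == "mlp":
        return lj == li + 1
    if scheme == "lacn":
        return (lj == li + 1) | ((li == 0) & (lj == len(sizes) - 1))
    if scheme == "gmlp":
        return lj > li  # any connection from an earlier to a later layer
    raise ValueError(f"unknown scheme: {scheme}")

# Example: 3 inputs, two hidden layers of 4, 2 outputs.
sizes = (3, 4, 4, 2)
for s in ("mlp", "lacn", "gmlp"):
    print(s, int(connection_mask(sizes, s).sum()))
```

For this layout the MLP has 36 weighted connections, the LACN 42 (the MLP's plus the 6 input-to-output links), and the gMLP 62, which illustrates how quickly the gMLP's parameter count grows with depth.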