Exploring constructive cascade networks

  • Authors:
  • N. K. Treadgold; T. D. Gedeon

  • Affiliations:
  • Dept. of Comput. Sci. & Eng., Univ. of New South Wales, Kensington, NSW

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1999

Abstract

Constructive algorithms have proved to be powerful methods for training feedforward neural networks. An important property of these algorithms is their ability to generalize. A series of empirical studies was performed to examine the effect of regularization on generalization in constructive cascade algorithms. It was found that combining early stopping with regularization resulted in better generalization than using early stopping alone. A cubic penalty term that heavily penalizes large weights was shown to benefit generalization in cascade networks. An adaptive method of setting the regularization magnitude in constructive algorithms was introduced and shown to produce generalization results similar to those obtained with a fixed, user-optimized regularization setting. This adaptive method also resulted in the construction of smaller networks for more complex problems. The acasper algorithm, which incorporates the insights obtained from the empirical studies, was shown to have good generalization and network construction properties. This algorithm was compared to the cascade correlation algorithm on the Proben 1 benchmark and additional regression data sets.
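
As a rough illustration of the kind of penalty the abstract refers to, the sketch below adds a cubic weight-decay term of the assumed form lambda * sum(|w|^3) to a squared-error loss. The function names, the lambda value, and the exact form of the penalty are illustrative assumptions, not the paper's actual implementation.

import numpy as np

def cubic_penalty(weights, lam=1e-3):
    # Assumed cubic regularization term: lam * sum(|w|^3).
    # Grows much faster than quadratic decay, so large weights are penalized heavily.
    return lam * np.sum(np.abs(weights) ** 3)

def penalized_loss(y_true, y_pred, weights, lam=1e-3):
    # Mean squared error plus the cubic penalty (illustrative only).
    return np.mean((y_true - y_pred) ** 2) + cubic_penalty(weights, lam)

def cubic_penalty_grad(weights, lam=1e-3):
    # Gradient of the penalty w.r.t. each weight: 3 * lam * sign(w) * w^2,
    # which would be added to the error gradient during training.
    return 3.0 * lam * np.sign(weights) * weights ** 2
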