An integrated growing-pruning method for feedforward network training

  • Authors:
  • Pramod L. Narasimha; Walter H. Delashmit; Michael T. Manry; Jiang Li; Francisco Maldonado

  • Affiliations:
  • Department of Electrical Engineering, University of Texas at Arlington, Arlington, TX 76013, USA; Lockheed Martin, Missiles and Fire Control Dallas, Grand Prairie, TX 75051, USA; Department of Electrical Engineering, University of Texas at Arlington, Arlington, TX 76013, USA; Electrical and Computer Engineering, Old Dominion University, Norfolk, VA 23529, USA; Williams Pyro, Inc., 200 Greenleaf Street, Fort Worth, TX 76107, USA

  • Venue:
  • Neurocomputing
  • Year:
  • 2008

Abstract

In order to facilitate complexity optimization in feedforward networks, several algorithms are developed that combine growing and pruning. First, a growing scheme is presented which iteratively adds new hidden units to fully trained networks. Then, a non-heuristic one-pass pruning technique is presented, which utilizes orthogonal least squares. Based upon pruning, a one-pass approach is developed for generating the validation error versus network size curve. A combined approach is described in which networks are continually pruned during the growing process. As a result, the hidden units are ordered according to their usefulness, and the least useful units are eliminated. Examples show that networks designed using the combined method have lower training and validation error than those produced by growing or pruning alone. The combined method exhibits reduced sensitivity to the initial weights and generates an almost monotonic error versus network size curve. It is shown to perform better than two well-known growing methods: constructive backpropagation and cascade correlation.
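The central mechanism the abstract describes, ranking hidden units by usefulness with orthogonal least squares and then pruning from the least useful end, can be illustrated with a minimal sketch. The code below is an illustrative, generic greedy OLS forward-selection over a matrix of hidden-unit activations, not the authors' exact procedure; the function name, the matrices H and T, and the tolerance are assumptions introduced here for the example.

```python
import numpy as np

def ols_rank_hidden_units(H, T, tol=1e-12):
    """Rank hidden units by usefulness via greedy orthogonal least squares.

    H : (N, Nh) hidden-unit activations for N training patterns.
    T : (N, M)  desired outputs.
    Returns unit indices ordered from most to least useful.
    """
    N, Nh = H.shape
    remaining = list(range(Nh))
    order = []
    Q = np.empty((N, 0))            # orthogonal basis of selected units
    R = T.astype(float).copy()      # residual target not yet explained
    for _ in range(Nh):
        best_idx, best_gain, best_q = None, -np.inf, None
        for j in remaining:
            h = H[:, j].astype(float)
            if Q.shape[1] > 0:
                # Orthogonalize candidate against already-selected units
                h = h - Q @ (Q.T @ h)
            norm = np.linalg.norm(h)
            if norm < tol:
                continue            # redundant unit, contributes nothing new
            q = h / norm
            # Training-error reduction from adding this orthogonal direction
            gain = np.sum((q @ R) ** 2)
            if gain > best_gain:
                best_idx, best_gain, best_q = j, gain, q
        if best_idx is None:
            break
        order.append(best_idx)
        remaining.remove(best_idx)
        Q = np.column_stack([Q, best_q])
        R = R - np.outer(best_q, best_q @ R)   # shrink the residual
    return order + remaining                   # redundant units last
```

Given such an ordering, one can prune by dropping units from the tail of the list, re-solving the output weights by linear least squares for each truncated network, and recording validation error at each size, which is one plausible way to obtain the error versus network size curve in a single pass over candidate sizes.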