Neural network pruning with Tukey-Kramer multiple comparison procedure

  • Authors:
  • Donald E. Duckro; Dennis W. Quinn; Samuel J. Gardner, III

  • Affiliations:
  • Air Force Institute of Technology, Department of Mathematics and Statistics, Wright-Patterson Air Force Base, Ohio (all authors)

  • Venue:
  • Neural Computation
  • Year:
  • 2002


Abstract

Reducing a neural network's complexity improves its ability to generalize to future examples. Like an overfitted regression function, a neural network may miss its target because of the excessive degrees of freedom stored up in unnecessary parameters. Over the past decade, work on pruning networks has produced nonstatistical algorithms such as Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon, which remove the connections with the least salience. The method proposed here uses the bootstrap algorithm to estimate the distribution of the model parameter saliences. Statistical multiple comparison procedures are then used to make pruning decisions. We show that this method compares well with Optimal Brain Surgeon in terms of pruning ability and the resulting network performance.
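
The sketch below is a rough illustration of the idea in the abstract: estimate per-parameter salience distributions by refitting a model on bootstrap resamples, then apply a Tukey-Kramer simultaneous comparison to prune parameters whose mean salience is not significantly larger than the smallest one. The helper names, the squared-weight salience measure, the toy least-squares model, and the use of SciPy's studentized-range distribution are assumptions made for this example, not the authors' exact procedure.

```python
# Illustrative sketch only: the helper names, the squared-weight salience,
# and the bootstrap/ANOVA bookkeeping below are assumptions for this
# example, not the authors' exact algorithm.
import numpy as np
from scipy.stats import studentized_range


def bootstrap_saliences(X, y, fit_fn, salience_fn, n_boot=200, seed=None):
    """Refit the model on bootstrap resamples of (X, y) and record each
    parameter's salience, giving an empirical salience distribution."""
    rng = np.random.default_rng(seed)
    n = len(X)
    rows = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)      # resample with replacement
        params = fit_fn(X[idx], y[idx])       # refit on the resample
        rows.append(salience_fn(params))      # one salience per parameter
    return np.asarray(rows)                   # shape (n_boot, n_params)


def tukey_kramer_prune(saliences, alpha=0.05):
    """Return a boolean mask of parameters to keep: those whose mean
    salience is significantly larger, under a Tukey-Kramer simultaneous
    test, than the smallest mean salience."""
    n_boot, k = saliences.shape
    means = saliences.mean(axis=0)
    mse = saliences.var(axis=0, ddof=1).mean()       # pooled within-group variance
    df = k * (n_boot - 1)                            # ANOVA error degrees of freedom
    q_crit = studentized_range.ppf(1.0 - alpha, k, df)
    # Tukey-Kramer half-width; with equal group sizes it reduces to Tukey's HSD.
    margin = q_crit * np.sqrt(0.5 * mse * (1.0 / n_boot + 1.0 / n_boot))
    base = means.min()                               # least salient parameter
    return (means - base) > margin                   # False => candidate for pruning


# Toy usage with a least-squares linear model and squared-weight salience:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    y = X @ np.array([2.0, 0.0, 0.0, 1.5, 0.0]) + rng.normal(scale=0.1, size=100)
    fit_fn = lambda Xb, yb: np.linalg.lstsq(Xb, yb, rcond=None)[0]
    salience_fn = lambda w: w ** 2
    keep = tukey_kramer_prune(bootstrap_saliences(X, y, fit_fn, salience_fn))
    print("keep parameters:", keep)
```

In this sketch the pruning decision is made jointly over all parameters at the chosen family-wise error rate, which is the point of using a multiple comparison procedure rather than thresholding each salience separately.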