Advances in Neural Information Processing Systems 2
A penalty-function approach for pruning feedforward neural networks. Neural Computation
Pruning using parameter and neuronal metrics. Neural Computation
Neural Networks for Pattern Recognition
IEEE Transactions on Knowledge and Data Engineering
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon. Advances in Neural Information Processing Systems 5 (NIPS Conference)
CARVE: a constructive algorithm for real-valued examples. IEEE Transactions on Neural Networks
Reducing a neural network's complexity improves its ability to generalize to future examples. Like an overfitted regression function, a neural network may miss its target because of the excessive degrees of freedom stored in unnecessary parameters. Over the past decade, research on network pruning has produced nonstatistical algorithms such as Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon, which remove the connections with the least salience. The method proposed here uses the bootstrap algorithm to estimate the distribution of the model parameter saliences. Statistical multiple comparison procedures are then used to make pruning decisions. We show that this method compares well with Optimal Brain Surgeon in terms of its ability to prune and the resulting network performance.
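The idea described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it uses a linear least-squares model in place of a trained feedforward network, defines a weight's salience as the increase in training error when that weight is zeroed, bootstraps the salience distribution by resampling the data, and replaces the paper's multiple comparison procedure with a simple percentile-based cutoff. The threshold value and all variable names are assumptions for illustration only.

```python
# Sketch: bootstrap estimation of parameter saliences for pruning.
# A linear model stands in for a feedforward network (assumption).
import numpy as np

rng = np.random.default_rng(0)

def fit(X, y):
    # Least-squares fit stands in for network training.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def salience(w, X, y):
    # Salience of each weight: increase in mean squared error when zeroed.
    base = np.mean((X @ w - y) ** 2)
    s = np.empty_like(w)
    for i in range(len(w)):
        w0 = w.copy()
        w0[i] = 0.0
        s[i] = np.mean((X @ w0 - y) ** 2) - base
    return s

# Synthetic data: the third coefficient is irrelevant and should be pruned.
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=n)

# Bootstrap the salience distribution by resampling examples with replacement.
B = 500
saliences = np.empty((B, 3))
for b in range(B):
    idx = rng.integers(0, n, size=n)
    Xb, yb = X[idx], y[idx]
    saliences[b] = salience(fit(Xb, yb), Xb, yb)

# Prune weights whose 95th-percentile salience stays below a small threshold
# (a simplified stand-in for a statistical multiple comparison procedure).
upper = np.percentile(saliences, 95, axis=0)
prune = upper < 0.01
print(prune.tolist())
```

Because the distribution of each salience is estimated rather than computed from a single point estimate, the pruning decision accounts for sampling variability, which is the key difference from one-shot salience rankings such as Optimal Brain Surgeon's.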