The cascade-correlation learning architecture
Advances in Neural Information Processing Systems 2
Neural Networks for Pattern Recognition
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon
Advances in Neural Information Processing Systems 5 (NIPS Conference)
Geometrical interpretation and architecture selection of MLP
IEEE Transactions on Neural Networks
The AIC Criterion and Symmetrizing the Kullback–Leibler Divergence
IEEE Transactions on Neural Networks
In this paper we present a novel method for pruning redundant weights of a trained multilayer perceptron (MLP). The proposed method is based on a correlation analysis between the errors produced by the output neurons and the backpropagated errors associated with the hidden neurons. Repeated application of the method eventually eliminates all connections of a redundant neuron. Simulations using real-world data indicate that, in terms of performance, the proposed method compares favorably with standard pruning techniques such as Optimal Brain Surgeon (OBS) and Weight Decay and Elimination (WDE), but at a much lower computational cost.
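The abstract's idea can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's actual algorithm: the abstract only states that output errors are correlated with the backpropagated errors at the hidden neurons, so the network shape, the choice of Pearson correlation, and the single-neuron pruning rule here are all assumptions.

```python
import numpy as np

# Hypothetical sketch of correlation-based pruning for a one-hidden-layer MLP.
# The exact pruning statistic used in the paper is not given in the abstract;
# here we use the absolute Pearson correlation as a stand-in.

rng = np.random.default_rng(0)

# Tiny stand-in for a trained network: tanh hidden layer, linear output.
n_in, n_hid, n_out, n_samples = 4, 6, 2, 200
W1 = rng.normal(size=(n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(size=(n_out, n_hid))  # hidden -> output weights

X = rng.normal(size=(n_samples, n_in))
T = rng.normal(size=(n_samples, n_out))  # placeholder targets

H = np.tanh(X @ W1.T)                    # hidden activations
Y = H @ W2.T                             # network outputs
E_out = Y - T                            # output errors

# Backpropagated error at hidden neuron j: delta_j = (1 - h_j^2) * sum_k w_kj e_k
E_hid = (1.0 - H**2) * (E_out @ W2)      # shape (n_samples, n_hid)

# Correlate each hidden neuron's backpropagated error with the overall
# output-error magnitude; a weak correlation suggests redundancy.
e_norm = np.linalg.norm(E_out, axis=1)
corr = np.array([abs(np.corrcoef(E_hid[:, j], e_norm)[0, 1])
                 for j in range(n_hid)])

# Prune: zero every connection of the least-correlated hidden neuron,
# mirroring the repeated application that removes a neuron entirely.
prune_idx = int(np.argmin(corr))
W1[prune_idx, :] = 0.0
W2[:, prune_idx] = 0.0
print("pruned hidden neuron:", prune_idx)
```

Repeating this loop (recomputing errors after each removal) would progressively strip redundant neurons, which is the behavior the abstract describes.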