Letters: Support vector perceptrons
Neurocomputing
Several methods to reduce the excessive number of neurons and synaptic weights in a feedforward multilayer perceptron artificial neural network (ANN) are presented. To reduce the number of synaptic weights, the authors replace the original weight matrix with a product of two smaller matrices, so that the number of multiplications required is reduced. To reduce the number of hidden units, they exploit the correlation among the outputs of hidden neurons in the same layer: a method is proposed to identify and remove redundant hidden units and to update the weights of the remaining neurons. This approach offers potentially good performance without retraining. When retraining is applied to fine-tune the reduced network, the updated weights serve as very good initial conditions, enabling much faster convergence than training from random initial conditions.
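The two reduction ideas in the abstract can be sketched in NumPy. This is a minimal illustration, not the paper's actual algorithm: it assumes the weight-matrix factorization takes the form of a truncated SVD, and it assumes redundant hidden units are detected by near-unit correlation between their activation vectors and removed by folding their outgoing weights into the correlated unit. All layer sizes and variable names (`W`, `A`, `B`, `H`, `V`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- 1. Weight reduction: replace W by a product of two smaller matrices ---
n_in, n_hidden, rank = 64, 32, 8           # hypothetical layer sizes
W = rng.standard_normal((n_hidden, n_in))  # original weight matrix

# Truncated SVD gives the best rank-k product W ~= A @ B (assumed factorization).
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :rank] * s[:rank]   # (n_hidden, rank)
B = Vt[:rank, :]             # (rank, n_in)

# Multiplications for one input vector: W @ x versus A @ (B @ x).
full_cost = n_hidden * n_in                    # 32 * 64 = 2048
factored_cost = rank * n_in + n_hidden * rank  # 8 * 64 + 32 * 8 = 768

# --- 2. Unit reduction: remove a redundant hidden unit via output correlation ---
H = np.tanh(rng.standard_normal((200, 5)) @ rng.standard_normal((5, 3)))
H = np.column_stack([H, H[:, 0]])  # unit 3 duplicates unit 0 (fully redundant)
V = rng.standard_normal((2, 4))    # outgoing weights to the next layer

corr = np.corrcoef(H, rowvar=False)
assert abs(corr[0, 3]) > 0.999     # flagged as redundant

# Fold unit 3's outgoing weights into unit 0's, then drop unit 3.
V_reduced = V[:, :3].copy()
V_reduced[:, 0] += V[:, 3]
y_full = H @ V.T
y_reduced = H[:, :3] @ V_reduced.T
print(np.allclose(y_full, y_reduced))  # True: output preserved without retraining
```

In this toy case the duplicated unit is removed exactly, matching the abstract's claim that redundant units can be dropped (with the remaining weights updated) without retraining; with merely highly correlated, rather than identical, units the merge would introduce a small error that fine-tuning could then remove quickly.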