Structural simplification of a feed-forward, multilayer perceptron artificial neural network

  • Authors:
  • Y. H. Hu; Q. Xue; W. J. Tompkins

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Wisconsin Univ., WI, USA (all authors)

  • Venue:
  • ICASSP '91: 1991 International Conference on Acoustics, Speech, and Signal Processing
  • Year:
  • 1991

Abstract

Several methods are presented for reducing the excessive number of neurons and synaptic weights in a feedforward, multilayer perceptron artificial neural network (ANN). To reduce the number of synaptic weights, the authors replace the original weight matrix with a product of two smaller matrices, so that fewer multiplications are required. To reduce the number of hidden units, they exploit the correlation among the outputs of hidden neurons in the same layer: a method is proposed to identify and remove redundant hidden units and to update the weights of the remaining neurons. This approach offers potentially good performance without retraining. When retraining is applied to fine-tune the reduced network, the updated weights serve as very good initial conditions, enabling much faster convergence than training from random initial conditions.
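
The weight-matrix reduction can be illustrated with a truncated SVD, which gives the best rank-k factorization of a weight matrix into two smaller factors. The abstract does not specify how the two smaller matrices are obtained, so the sketch below, with hypothetical layer sizes m, n and rank k, shows only one plausible realization of such a product. Replacing W (m by n) with A (m by k) times B (k by n) cuts the multiplications per forward pass from m*n to k*(m + n).

```python
import numpy as np

# Hypothetical sizes: a 64x128 weight matrix compressed to rank 16.
m, n, k = 64, 128, 16

rng = np.random.default_rng(0)
W = rng.standard_normal((m, n))  # original weight matrix

# Truncated SVD gives the best rank-k approximation W ~= A @ B.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * s[:k]   # m x k factor (singular values folded in)
B = Vt[:k, :]          # k x n factor

# Applying A @ (B @ x) needs k*(m + n) multiplications instead of m*n.
print("multiplies: original", m * n, "factored", k * (m + n))

x = rng.standard_normal(n)
print("max abs error:", np.abs(W @ x - A @ (B @ x)).max())
```

For a layer computing y = f(Wx), the factored form y = f(A(Bx)) is exact when k equals the rank of W and an approximation otherwise.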
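
The hidden-unit reduction rests on the observation that when two hidden neurons produce highly correlated outputs over the training set, one of them is redundant. The sketch below assumes a linear fit h_j ≈ a*h_i + b between correlated units and folds the removed unit's outgoing weights into the surviving unit and the next layer's bias; the function name, the threshold, and this particular update rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def prune_correlated_units(H, W_out, b_out, threshold=0.98):
    """Drop hidden units whose activations are nearly linear in another
    unit's, folding their contribution into the survivor (a hypothetical
    reading of the correlation-based update).

    H     : (samples, n_hidden) hidden activations over the training set
    W_out : (n_hidden, n_out)   weights from hidden layer to next layer
    b_out : (n_out,)            biases of the next layer
    """
    W_out, b_out = W_out.copy(), b_out.copy()
    corr = np.corrcoef(H, rowvar=False)  # pairwise correlations of units
    removed = set()
    for j in range(H.shape[1]):
        for i in range(j):
            if i in removed or j in removed:
                continue
            if abs(corr[i, j]) >= threshold:
                # Least-squares line h_j ~= a * h_i + b over the data.
                a, b = np.polyfit(H[:, i], H[:, j], 1)
                # Unit j's effect, W_out[j] * h_j, becomes approximately
                # a * W_out[j] * h_i + b * W_out[j]: add the first term
                # to unit i's outgoing weights, the second to the bias.
                W_out[i] += a * W_out[j]
                b_out += b * W_out[j]
                removed.add(j)
                break
    keep = [u for u in range(H.shape[1]) if u not in removed]
    return keep, W_out[keep], b_out
```

The reduced network keeps only the rows in `keep` of the hidden layer's incoming weights; retraining can then start from these folded weights rather than from random values, which is the fast fine-tuning the abstract describes.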