Performance evaluation of a partial retraining scheme for defective multi-layer neural networks

  • Authors:
  • Kunihito Yamamori, Toru Abe, Susumu Horiguchi

  • Affiliation (all authors):
  • Japan Advanced Institute of Science and Technology, 1-1 Asahi-Dai, Tatsunokuchi, Ishikawa 923-1292, Japan

  • Venue:
  • ACSAC '01 Proceedings of the 6th Australasian conference on Computer systems architecture
  • Year:
  • 2001

Abstract

This paper addresses an efficient compensation scheme for stuck defects in multi-layer artificial neural networks implemented in hardware. To compensate for stuck defects, we have proposed a two-stage partial retraining scheme that adjusts the weights belonging to a neuron affected by a defect, using the back-propagation (BP) algorithm applied between two layers. For defective input neurons, the partial retraining scheme is applied twice: the first stage between the input layer and the hidden layer, and the second stage between the hidden layer and the output layer. The partial retraining scheme requires no additional circuits if the hardware neural network already contains circuits for learning. In this paper, we evaluate the performance of the partial retraining scheme in terms of retraining time, network yield, and generalization ability. As a result, the partial retraining scheme compensated for neuron stuck defects about 10 times faster than retraining the whole network with the BP algorithm. In addition, network yields were also improved. The partial retraining scheme achieved a recognition ratio of more than 80% for noisy input patterns even when 16% of the network's neurons had 0-stuck or 1-stuck defects.
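The idea behind partial retraining can be illustrated with a minimal sketch: train a small network, force one hidden neuron's output to a stuck value (simulating a hardware 0-stuck defect), and then retrain only the weights fanning out of the affected layer while freezing everything else. This is a simplified one-stage illustration of the partial-retraining idea, not the paper's full two-stage procedure; the toy task, layer sizes, seed, and learning rates are my own choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy task (XOR) and a 2-4-1 network; sizes and rates are illustrative.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def forward(stuck=None):
    h = sigmoid(X @ W1 + b1)
    if stuck is not None:          # hardware stuck defect: neuron i's
        i, v = stuck               # output is forced to the constant v
        h = h.copy()
        h[:, i] = v
    return h, sigmoid(h @ W2 + b2)

def mse(out):
    return float(np.mean((out - y) ** 2))

# 1) Train the healthy network with ordinary whole-network BP.
lr = 2.0
for _ in range(8000):
    h, out = forward()
    d_out = (out - y) * out * (1 - out)      # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)       # delta back-propagated to hidden
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)

loss_healthy = mse(forward()[1])

# 2) Inject a 0-stuck defect into hidden neuron 0.
stuck = (0, 0.0)
loss_defect = mse(forward(stuck)[1])

# 3) Partial retraining: adjust only the weights between the affected
#    hidden layer and the output layer (W2, b2); W1, b1 stay frozen.
for _ in range(2000):
    h, out = forward(stuck)
    d_out = (out - y) * out * (1 - out)
    W2 -= 0.3 * h.T @ d_out / len(X)
    b2 -= 0.3 * d_out.mean(0)

loss_retrained = mse(forward(stuck)[1])
print(loss_healthy, loss_defect, loss_retrained)
```

Because only one layer of weights is updated, each retraining step costs a fraction of a full BP pass, which is the intuition behind the roughly 10x speed-up the abstract reports; the paper's scheme additionally applies a second stage for defective input neurons.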