Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
This paper addresses an efficient stuck-defect compensation scheme for multi-layer artificial neural networks implemented in hardware. To compensate for stuck defects, we propose a two-stage partial retraining scheme that adjusts only the weights belonging to the neuron affected by a defect, using the back-propagation (BP) algorithm between two layers. For defects in input neurons, partial retraining is applied twice: the first stage between the input layer and the hidden layer, and the second stage between the hidden layer and the output layer. The partial retraining scheme requires no additional circuitry if the hardware neural network already contains circuits for learning. In this paper we discuss the performance of the partial retraining scheme in terms of retraining time, network yield, and generalization ability. As a result, the partial retraining scheme compensated for neuron stuck defects about 10 times faster than retraining the whole network with the BP algorithm, and network yields were also improved. The partial retraining scheme achieved a recognition ratio of more than 80% for noisy input patterns even when 16% of the network's neurons had 0-stuck or 1-stuck defects.
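The idea behind partial retraining can be illustrated with a minimal sketch. The network sizes, task, learning rates, and iteration counts below are illustrative assumptions, not the paper's experimental setup: a small pre-trained MLP has one hidden neuron clamped to a stuck value, and only the hidden-to-output weights are then retrained with BP, so the healthy neurons absorb the defective neuron's role.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Toy 4-8-2 network on a synthetic task (sizes and data are illustrative).
X = rng.random((32, 4))
T = np.stack([X.sum(axis=1) > 2, X[:, 0] > 0.5], axis=1).astype(float)

W1 = rng.normal(0.0, 0.5, (4, 8))   # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (8, 2))   # hidden -> output weights

def forward(X, stuck=None, value=0.0):
    """Forward pass; optionally clamp one hidden neuron to a stuck value."""
    H = sigmoid(X @ W1)
    if stuck is not None:
        H[:, stuck] = value          # model a 0-stuck (or 1-stuck) defect
    return H, sigmoid(H @ W2)

# Pre-train the healthy network with plain BP on both layers.
for _ in range(2000):
    H, Y = forward(X)
    dY = (Y - T) * Y * (1 - Y)       # output-layer delta (MSE + sigmoid)
    dH = (dY @ W2.T) * H * (1 - H)   # hidden-layer delta
    W2 -= 1.0 * H.T @ dY / len(X)
    W1 -= 1.0 * X.T @ dH / len(X)

stuck = 3                            # hidden neuron 3 becomes 0-stuck
_, Y = forward(X, stuck)
err_before = np.mean((Y - T) ** 2)

# Partial retraining: BP restricted to the hidden-output layer only.
# The stuck neuron's activation stays clamped, so its outgoing weights
# receive zero gradient and the remaining neurons compensate.
for _ in range(500):
    H, Y = forward(X, stuck)
    dY = (Y - T) * Y * (1 - Y)
    W2 -= 1.0 * H.T @ dY / len(X)

_, Y = forward(X, stuck)
err_after = np.mean((Y - T) ** 2)
print(err_before, err_after)
```

Because only one layer's weights are updated, each retraining step costs a fraction of a full BP epoch, which is the intuition behind the roughly tenfold speed-up reported above. For a defect in an input neuron, the same restricted update would be applied in two stages, first to the input-hidden weights and then to the hidden-output weights.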