Comparative Fault Tolerance of Parallel Distributed Processing Networks
IEEE Transactions on Computers
When a learning algorithm is applied to a multilayer perceptron (MLP), different solutions for the weight values can be obtained if the parameters of the applied rule or the initial conditions are changed. Those solutions can present similar learning performance, but they differ in other respects, in particular in fault tolerance against weight perturbations. In this paper, a backpropagation algorithm that maximizes fault tolerance is proposed. The algorithm explicitly adds to the backpropagation learning rule a new term, related to the mean square error degradation in the presence of weight deviations, in order to minimize this degradation. The results obtained demonstrate the efficiency of the proposed learning rule in comparison with other algorithms.
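The paper does not reproduce its learning rule here, but the idea of augmenting backpropagation with a term that penalizes error degradation under weight deviations can be sketched. The toy script below is a minimal illustration, not the authors' method: it approximates the degradation term by averaging backpropagation gradients evaluated at noise-perturbed copies of the weights (a common surrogate for training under weight deviations). The network size, the task (XOR), and the hyperparameters `lr`, `lam`, and `sigma` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-4-1 MLP, sigmoid units, mean square error.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, b1, W2, b2):
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    return H, Y

def grads(W1, b1, W2, b2):
    """Standard backpropagation gradients of the squared error."""
    H, Y = forward(W1, b1, W2, b2)
    dY = (Y - T) * Y * (1 - Y)          # output-layer delta
    dH = (dY @ W2.T) * H * (1 - H)      # hidden-layer delta
    mse = float(((Y - T) ** 2).mean())
    return H.T @ dY, dY.sum(0), X.T @ dH, dH.sum(0), mse

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

lr, lam, sigma = 1.0, 0.5, 0.1  # illustrative hyperparameters (assumed)
_, _, _, _, init_mse = grads(W1, b1, W2, b2)

for epoch in range(3000):
    gW2, gb2, gW1, gb1, _ = grads(W1, b1, W2, b2)
    # Extra term: gradients at noise-perturbed weights stand in for the
    # error-degradation penalty, pushing toward weights whose error stays
    # low when the weights deviate.
    for _ in range(2):
        nW1 = W1 + rng.normal(scale=sigma, size=W1.shape)
        nW2 = W2 + rng.normal(scale=sigma, size=W2.shape)
        pW2, pb2, pW1, pb1, _ = grads(nW1, b1, nW2, b2)
        gW1 += lam * pW1 / 2; gb1 += lam * pb1 / 2
        gW2 += lam * pW2 / 2; gb2 += lam * pb2 / 2
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, _, _, _, final_mse = grads(W1, b1, W2, b2)
print(init_mse, final_mse)
```

A fault-tolerance check in the spirit of the paper would then compare the error of this network against one trained by plain backpropagation, after injecting the same random deviations into both sets of trained weights.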