This article analyzes the influence of weight and input perturbations on a multilayer perceptron (MLP). Quantitative measures of fault tolerance, noise immunity, and generalization ability are derived. The resulting expressions justify several previously reported conjectures and experimental results, such as the influence of weight magnitudes, the relation between training with noise and generalization ability, and the relation between fault tolerance and generalization ability. Because the measures introduced here are explicitly related to the degradation of the mean squared error in the presence of perturbations, they constitute a selection criterion among alternative weight configurations. They also predict the degradation of an MLP's learning performance when its weights or inputs deviate from their nominal values, so the behavior of a physical implementation can be evaluated, according to its accuracy, before the weights are mapped onto it.
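The mean-squared-error degradation that these measures quantify can also be estimated empirically. The following sketch (an illustrative assumption, not the article's derivation) builds a one-hidden-layer MLP whose output weights are set to their least-squares optimum, then perturbs those weights with additive Gaussian noise and measures the resulting increase in MSE by Monte Carlo; the network sizes, data, and noise level `sigma` are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: 4 inputs, one smooth scalar target.
X = rng.normal(size=(200, 4))
y = np.sin(X).sum(axis=1, keepdims=True)

W1 = rng.normal(size=(4, 8))                 # fixed random hidden-layer weights
H = np.tanh(X @ W1)                          # hidden activations
W2, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights at the MSE optimum

def mse(w_out):
    """MSE of the network with output weights w_out (hidden layer fixed)."""
    return float(np.mean((H @ w_out - y) ** 2))

nominal = mse(W2)

# Expected MSE under additive Gaussian perturbations of the output weights,
# estimated over 500 draws.  Because W2 sits at the optimum, every draw can
# only increase the error, so the mean degradation is nonnegative.
sigma = 0.05
perturbed = np.mean([mse(W2 + sigma * rng.normal(size=W2.shape))
                     for _ in range(500)])
degradation = perturbed - nominal
```

Comparing `degradation` across candidate weight configurations (or across values of `sigma`) is one concrete way to use an MSE-degradation measure as a selection criterion before mapping weights onto limited-accuracy hardware.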