In previous work we derived a quantity termed the 'Mean Squared Sensitivity' (MSS) to predict the performance degradation of an MLP affected by perturbations in different parameters. The present Letter continues this line of research, applying a similar methodology to RBF networks and studying the implications when they are affected by input noise. We obtain the corresponding analytical expression for MSS in RBF networks and validate it experimentally under two perturbation models: an additive and a multiplicative one. We discuss the relationship between MSS and generalization ability, and propose MSS as a quantitative measure for evaluating the noise immunity and generalization ability of an RBFN configuration, further generalizing our approach.
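To illustrate the kind of measurement the Letter analyzes, the sketch below empirically estimates the mean squared output deviation of a toy Gaussian RBF network under the two input-noise models mentioned above. This is a hedged Monte Carlo stand-in for the paper's analytical MSS expression, not the authors' derivation; the network's centers, widths, and weights are arbitrary illustration values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Gaussian RBF network: y(x) = sum_j w_j * exp(-||x - c_j||^2 / (2*sigma^2)).
# All parameter values here are arbitrary, chosen only for illustration.
centers = rng.uniform(-1, 1, size=(10, 2))  # 10 centers in a 2-D input space
sigma = 0.5                                 # common kernel width
weights = rng.normal(size=10)               # output-layer weights

def rbf(x):
    # x: (n_samples, 2) -> network outputs of shape (n_samples,)
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2)) @ weights

def empirical_mss(x, noise_std, model="additive", trials=200):
    """Monte Carlo estimate of the mean squared output deviation caused by
    input noise -- an empirical proxy for the analytical MSS of the Letter.
    'additive' perturbs inputs as x + eps, 'multiplicative' as x * (1 + eps)."""
    y0 = rbf(x)
    acc = 0.0
    for _ in range(trials):
        eps = rng.normal(scale=noise_std, size=x.shape)
        xp = x + eps if model == "additive" else x * (1 + eps)
        acc += np.mean((rbf(xp) - y0) ** 2)
    return acc / trials

x = rng.uniform(-1, 1, size=(500, 2))
print(empirical_mss(x, 0.05, "additive"))
print(empirical_mss(x, 0.05, "multiplicative"))
```

Comparing such estimates across candidate RBFN configurations (e.g. different kernel widths) is the use case the abstract proposes for MSS: a configuration with a smaller value degrades less under the same input noise.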