In this paper, an objective function for training a functional link network to tolerate multiplicative weight noise is presented. The objective function has the familiar regularizer-based form: a mean squared training error term plus a regularizer term. Our study shows that, under some mild conditions, the derived regularizer is essentially the same as a weight-decay regularizer. This explains why applying weight decay can also improve the fault tolerance of a radial basis function (RBF) network subject to multiplicative weight noise. Based on the objective function, a simple learning algorithm for a functional link network with multiplicative weight noise is derived. Finally, the mean prediction error of the trained network is analyzed. Simulated experiments on two artificial data sets and a real-world application verify the theoretical results.
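As a rough illustration of the connection the abstract draws, the sketch below trains an RBF-style functional link network by minimizing a mean squared error plus a weight-decay penalty, the form the paper argues approximates the multiplicative-weight-noise objective under mild conditions. This is not the paper's exact algorithm; the basis centers, Gaussian width, and decay strength are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy 1-D regression data (illustrative target, not from the paper)
x = np.linspace(-1.0, 1.0, 50)
y = np.sinc(2.0 * x) + 0.05 * rng.standard_normal(x.size)

centers = np.linspace(-1.0, 1.0, 10)  # basis-function centers (assumed)
width = 0.3                           # Gaussian width (assumed)
lam = 1e-2                            # weight-decay strength (assumed)

# Design matrix of Gaussian basis functions:
# Phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

# Weight-decay (ridge) solution: minimize ||Phi w - y||^2 + lam * ||w||^2,
# giving w = (Phi^T Phi + lam I)^{-1} Phi^T y in closed form.
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(centers.size), Phi.T @ y)

y_hat = Phi @ w
mse = float(np.mean((y_hat - y) ** 2))
```

Because the decay term keeps the weight magnitudes small, a multiplicative perturbation of each weight changes the output less, which is the intuition behind the fault-tolerance claim.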