In neural network training, adding a regularization term to the objective function is an effective way to improve generalization ability and fault tolerance. Recently, an open node fault regularizer (ONFR) approach was proposed for training radial basis function (RBF) networks. However, that approach aims only at minimizing the training-set error of the trained network under open node faults. This paper studies the generalization ability of faulty RBF networks. We derive a formula that predicts the generalization ability of a faulty RBF network without requiring a test set or the generation of a large number of potential faulty networks. Based on this formula, we then develop an algorithm to optimize the regularization parameter and the RBF width.
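To make the open node fault setting concrete: for an RBF network that is linear in its output weights, the training error averaged over random open node faults (each hidden node's output zeroed independently with rate p) has a closed form, and minimizing it yields a regularized least-squares solution. The sketch below illustrates this standard derivation on toy data; it is not the paper's exact ONFR or its generalization-prediction formula, and all data, widths, and fault rates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, for illustration only)
x = np.linspace(-3, 3, 60)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# Gaussian RBF design matrix: m centers, common width s
centers = np.linspace(-3, 3, 15)
s = 0.8
H = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * s**2))

p = 0.1  # open node fault rate: each node fails independently

def expected_fault_mse(w):
    """Closed-form E over faults of mean((y - H_beta w)^2), where each
    column of H is zeroed independently with probability p."""
    mean_term = np.sum((y - (1 - p) * H @ w) ** 2)
    var_term = p * (1 - p) * np.sum((H**2) @ (w**2))
    return (mean_term + var_term) / y.size

# Ordinary least-squares weights (fault-unaware baseline)
w_ls, *_ = np.linalg.lstsq(H, y, rcond=None)

# Fault-tolerant weights: minimizer of the expected fault error,
# solving [(1-p) H^T H + p * diag(H^T H)] w = H^T y
G = np.diag(np.sum(H**2, axis=0))
w_ft = np.linalg.solve((1 - p) * H.T @ H + p * G, H.T @ y)

def mc_fault_mse(w, trials=20000):
    """Monte Carlo estimate of the same expectation, by sampling faults."""
    errs = np.empty(trials)
    for t in range(trials):
        beta = rng.random(centers.size) >= p  # surviving nodes
        errs[t] = np.mean((y - (H * beta) @ w) ** 2)
    return errs.mean()

print(expected_fault_mse(w_ft), mc_fault_mse(w_ft))
```

The closed form is what makes approaches like ONFR cheap: the expected faulty error is evaluated analytically from the training data alone, instead of generating many faulty networks, which is also the spirit of the paper's test-set-free prediction of generalization ability.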