Prediction error of a fault tolerant neural network
Neurocomputing (Letters)
Prediction error of a fault tolerant neural network
ICONIP'06: Proceedings of the 13th International Conference on Neural Information Processing - Volume Part I
The regularization effect of applying the forgetting recursive least squares (FRLS) training technique to feedforward neural networks is studied. We derive equations for the expected prediction error and the expected training error, and by comparing them with the corresponding equations obtained previously for the weight decay method, we find that FRLS training has an effect identical to that of simple weight decay. This finding suggests that the FRLS technique is an online approach to realizing the weight decay effect. Moreover, we show that, under certain conditions, both the model complexity and the expected prediction error of a model trained with the FRLS technique are better than those of one trained with the standard RLS method.
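The paper establishes this equivalence for feedforward networks; a minimal sketch of the underlying mechanism is easiest to see in the linear-in-parameters case, where FRLS with initial inverse-correlation matrix P0 = I/lam recovers exactly an exponentially weighted ridge (weight decay) solution with a decayed penalty. The variable names, the forgetting factor value, and the synthetic data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Synthetic linear regression data (illustrative only).
rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

beta = 0.95  # forgetting factor (assumed value)
lam = 0.1    # initial regularization: P0 = I / lam

# FRLS recursion: one pass over the samples.
w = np.zeros(d)
P = np.eye(d) / lam
for x, t in zip(X, y):
    k = P @ x / (beta + x @ P @ x)      # gain vector
    w = w + k * (t - x @ w)             # prediction-error update
    P = (P - np.outer(k, x @ P)) / beta # inverse-correlation update

# Closed-form equivalent: ridge regression with exponentially
# discounted samples and a penalty that decays as beta**n * lam.
decay = beta ** (n - 1 - np.arange(n))
A = beta**n * lam * np.eye(d) + X.T @ (decay[:, None] * X)
b = X.T @ (decay * y)
w_closed = np.linalg.solve(A, b)

print(np.max(np.abs(w - w_closed)))  # agreement up to float error
```

As the forgetting factor approaches 1 the discounting disappears and the recursion reduces to standard RLS, whose solution is plain ridge regression with penalty lam; the forgetting factor trades the strength of this weight-decay-like penalty against responsiveness to recent samples.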