On H∞ filtering in feedforward neural networks training and pruning
ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part I
In the use of the extended Kalman filter (EKF) approach to training and pruning a feedforward neural network, one usually encounters two problems: how to set the initial condition, and how to use the result obtained to prune the network. In this paper, some guidance on setting the initial condition is presented and illustrated with a simple example. Then, under three assumptions: 1) the training set is sufficiently large; 2) the training converges; and 3) the trained network model is close to the actual one, an elegant equation is derived linking the error sensitivity measure (the saliency) with the result obtained via the extended Kalman filter. The validity of the derived equation is then verified with a simulated example.
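The pipeline the abstract describes — EKF training of a feedforward network, choosing the initial covariance, and ranking weights by a saliency built from the EKF results — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the network size, the toy regression data, the initial covariance P = p0·I, and the saliency proxy θᵢ²/Pᵢᵢ (a form common in EKF-pruning work) are all assumptions made for the example.

```python
import numpy as np

# Hedged sketch: EKF training of a single-hidden-layer tanh network,
# followed by a saliency proxy computed from the EKF error covariance.
# All sizes and constants below are illustrative assumptions.

rng = np.random.default_rng(0)

n_h = 5                                    # hidden units (assumed)
# parameter vector theta = [hidden weights w, hidden biases b, output weights v]
theta = rng.normal(scale=0.5, size=3 * n_h)

def forward(theta, x):
    w, b, v = np.split(theta, 3)
    h = np.tanh(w * x + b)
    return v @ h

def jacobian(theta, x):
    # d y / d theta, linearizing the network output at the current estimate
    w, b, v = np.split(theta, 3)
    h = np.tanh(w * x + b)
    dh = 1.0 - h ** 2
    return np.concatenate([v * dh * x, v * dh, h])

# Initial condition: a large P = p0 * I is a common "uninformative" heuristic
# (the kind of setting the paper gives cues for); p0 and R are assumed here.
P = np.eye(theta.size) * 100.0             # initial error covariance
R = 0.01                                   # assumed measurement-noise variance

for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)
    y = np.sin(np.pi * x) + rng.normal(scale=0.05)   # toy target function
    H = jacobian(theta, x)                           # linearized measurement row
    S = H @ P @ H + R                                # innovation variance (scalar)
    K = P @ H / S                                    # Kalman gain
    theta = theta + K * (y - forward(theta, x))      # state (weight) update
    P = P - np.outer(K, H @ P)                       # covariance update

# Saliency proxy from the converged EKF quantities; weights with the smallest
# saliency are the first pruning candidates.
saliency = theta ** 2 / np.diag(P)
prune_order = np.argsort(saliency)
```

The point of the sketch is that pruning comes essentially for free: the covariance P is already maintained by the EKF during training, so no separate Hessian computation (as in optimal-brain-damage-style methods) is needed to rank weights.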