Neural Networks Training with Optimal Bounded Ellipsoid Algorithm
ISNN '07 Proceedings of the 4th international symposium on Neural Networks: Advances in Neural Networks
Compared with standard learning algorithms such as backpropagation, the optimal bounded ellipsoid (OBE) algorithm has better properties, such as faster convergence, because its structure is similar to that of the Kalman filter. OBE also has an advantage over Kalman filter training: the noise is not required to be Gaussian. In this paper the OBE algorithm is applied to train the weights of a feedforward neural network for nonlinear system identification. Both hidden-layer and output-layer weights can be updated. From a dynamic-systems point of view, such training is useful for all neural network applications that require real-time updating of the weights. Two simulations demonstrate the effectiveness of the suggested algorithm.
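The Kalman-like structure of the update can be illustrated with a minimal sketch. This is not the paper's exact recursion: it applies an RLS/OBE-style recursive update only to the output-layer weights of a one-hidden-layer network with fixed random hidden weights, identifying the toy nonlinear map y = sin(x); the forgetting factor `lam` and all dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_hidden = 20
W_h = rng.normal(size=n_hidden)        # fixed hidden-layer weights (assumption;
b_h = rng.normal(size=n_hidden)        # the paper also updates the hidden layer)
w = np.zeros(n_hidden)                 # output-layer weights, updated online
P = np.eye(n_hidden) * 100.0           # ellipsoid/covariance-like matrix
lam = 0.98                             # forgetting factor (assumed value)

def hidden(x):
    """Hidden-layer activations for a scalar input."""
    return np.tanh(W_h * x + b_h)

errs = []
for k in range(500):
    x = rng.uniform(-np.pi, np.pi)     # input sample
    y = np.sin(x)                      # nonlinear plant output to identify
    phi = hidden(x)                    # regressor vector
    e = y - phi @ w                    # a priori prediction error
    Pphi = P @ phi
    g = Pphi / (lam + phi @ Pphi)      # Kalman-like gain
    w = w + g * e                      # weight update
    P = (P - np.outer(g, Pphi)) / lam  # ellipsoid matrix update
    errs.append(abs(e))
```

Because the recursion propagates the matrix `P`, each new sample shrinks the prediction error much faster than a gradient step of backpropagation would, which is the convergence advantage the abstract refers to.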