The solution of nonlinear least-squares problems is investigated. The asymptotic behavior is studied and conditions for convergence are derived. To deal with such problems in a recursive and efficient way, an algorithm based on a modified extended Kalman filter (MEKF) is proposed. The estimation error of the MEKF algorithm is proved to be exponentially bounded. Batch and iterated versions of the algorithm are also given. As an application, the algorithm is used to optimize the parameters of certain nonlinear input-output mappings. Simulation results on the interpolation of real data and the prediction of chaotic time series are presented.
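To make the recursive approach concrete, the following is a minimal sketch of how a standard extended-Kalman-style update can solve a nonlinear least-squares parameter-estimation problem one measurement at a time. It is an illustrative plain EKF with relinearization at each sample and cyclic passes over the data, not the paper's MEKF; the model `f`, its Jacobian `jac`, the noise variance `r`, and the exponential test problem are all assumptions chosen for the example.

```python
import numpy as np

def ekf_fit(f, jac, theta0, P0, xs, ys, r=1e-2, passes=5):
    """Recursive nonlinear least squares via EKF-style updates.

    f(x, theta)   -> scalar model prediction
    jac(x, theta) -> gradient of f w.r.t. theta (length-d array)
    Repeated (cyclic) passes over the data play the role of an
    iterated version of the recursion.
    """
    theta = np.asarray(theta0, dtype=float).copy()
    P = np.asarray(P0, dtype=float).copy()
    for _ in range(passes):
        for x, y in zip(xs, ys):
            H = jac(x, theta).reshape(1, -1)      # linearize at current estimate
            S = float(H @ P @ H.T) + r            # innovation variance
            K = (P @ H.T / S).ravel()             # Kalman gain
            theta += K * (y - f(x, theta))        # measurement update
            P -= np.outer(K, (H @ P).ravel())     # covariance update
    return theta, P

# Hypothetical test problem: fit y = a * exp(b * x) from noisy samples.
rng = np.random.default_rng(0)
a_true, b_true = 2.0, -0.5
xs = np.linspace(0.0, 4.0, 200)
ys = a_true * np.exp(b_true * xs) + 0.01 * rng.standard_normal(xs.size)

f = lambda x, th: th[0] * np.exp(th[1] * x)
jac = lambda x, th: np.array([np.exp(th[1] * x),
                              th[0] * x * np.exp(th[1] * x)])

theta, _ = ekf_fit(f, jac, theta0=[1.0, 0.0], P0=np.eye(2), xs=xs, ys=ys)
```

Each measurement costs only a rank-one update of the covariance `P`, which is what makes the recursive formulation attractive compared with re-solving a batch Gauss-Newton problem as data arrive.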