In this paper we propose and analyze nonlinear least squares methods that process the data incrementally, one data block at a time. Such methods are well suited to large data sets and real-time operation, and they have received much attention in the context of neural network training problems. We focus on the extended Kalman filter, which may be viewed as an incremental version of the Gauss-Newton method. We provide a nonstochastic analysis of its convergence properties, and we discuss variants aimed at accelerating its convergence.
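To make the connection concrete, here is a minimal sketch (not the authors' exact algorithm) of the extended Kalman filter applied to a nonlinear least squares problem, processing one data block at a time. The model `h`, its Jacobian `jac`, the unit-noise assumption, and all variable names are illustrative choices, not taken from the paper:

```python
import numpy as np

def ekf_nls(x0, P0, data, h, jac, sweeps=3):
    """Extended Kalman filter as an incremental Gauss-Newton method.

    x0:   initial parameter estimate (n,)
    P0:   initial covariance / inverse of the Gauss-Newton Hessian (n, n)
    data: iterable of (t_i, y_i) data blocks
    h:    model function h(x, t) -> predicted measurement (m,)
    jac:  Jacobian of h with respect to x at (x, t) -> (m, n)

    Each block triggers one linearized (Gauss-Newton-like) update;
    repeated sweeps over the data refine the estimate.
    """
    x, P = x0.astype(float).copy(), P0.astype(float).copy()
    for _ in range(sweeps):
        for t, y in data:
            r = np.atleast_1d(y - h(x, t))        # residual for this block
            J = np.atleast_2d(jac(x, t))          # local linearization
            S = J @ P @ J.T + np.eye(len(r))      # innovation covariance (unit noise assumed)
            K = P @ J.T @ np.linalg.inv(S)        # Kalman gain
            x = x + K @ r                         # incremental parameter update
            P = P - K @ J @ P                     # shrink covariance after each block
    return x, P

# Illustrative use: fit y = exp(b * t) to noiseless data with true b = 0.5.
ts = np.linspace(0.1, 1.0, 10)
data = [(t, np.array([np.exp(0.5 * t)])) for t in ts]
h = lambda x, t: np.array([np.exp(x[0] * t)])
jac = lambda x, t: np.array([[t * np.exp(x[0] * t)]])

b_est, _ = ekf_nls(np.array([0.4]), np.eye(1), data, h, jac)
```

Because each update uses only one data block, the method never forms the full normal equations; this is what makes it attractive for large data sets and real-time settings, at the cost of convergence behavior that depends on the ordering of the blocks and the gain sequence.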