A prewhitening RLS projection alternated subspace tracking (PAST) algorithm
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
We derive two new O(N²) algorithms for arbitrary recursive least-squares (RLS) estimation tasks. The algorithms employ a novel update for an inverse square-root factor of the exponentially-windowed input-signal autocorrelation matrix that is the least-squares equivalent of a natural-gradient prewhitening algorithm. Both new RLS algorithms require 4N² + O(N) multiply/adds, two divides, and one square root per iteration. We prove that the new algorithms are numerically robust, and fixed-point simulations confirm this behavior. An algorithm that computes the square-root factorization of the input-signal autocorrelation matrix is also described.
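For context, the abstract's O(N²)-per-iteration cost can be seen in the textbook exponentially-windowed RLS recursion sketched below. This is not the paper's algorithm: the sketch propagates the inverse autocorrelation matrix P directly (via the matrix-inversion lemma), whereas the paper propagates an inverse square-root factor of that matrix, which is the source of its numerical robustness. All names, parameters, and the FIR-identification setup are illustrative assumptions.

```python
import numpy as np

def rls(xs, ds, n, lam=0.99, delta=1e3):
    """Textbook exponentially-windowed RLS, O(n^2) work per sample.

    Propagates P, the inverse of the windowed autocorrelation matrix,
    directly; the paper instead propagates an inverse square-root
    factor of this matrix for numerical robustness.
    """
    w = np.zeros(n)            # filter-weight estimate
    P = delta * np.eye(n)      # inverse autocorrelation estimate
    for x, d in zip(xs, ds):
        Px = P @ x
        k = Px / (lam + x @ Px)          # RLS gain vector
        e = d - w @ x                    # a priori estimation error
        w = w + k * e
        P = (P - np.outer(k, Px)) / lam  # matrix-inversion-lemma update
    return w

# Illustrative use: identify a known 4-tap FIR filter from noisy data.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])
u = rng.standard_normal(500)
d = np.convolve(u, h)[:500] + 1e-3 * rng.standard_normal(500)
# Regressor at time t is [u[t], u[t-1], ..., u[t-3]], zero-padded at the start.
U = np.column_stack([np.concatenate([np.zeros(i), u[: 500 - i]]) for i in range(4)])
w = rls(U, d, 4)
```

Each iteration touches every entry of the n-by-n matrix P, giving the O(N²) per-sample cost the abstract refers to; the divide appears in the gain computation.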