Multidimensional Systems and Signal Processing
We present a unified convergence analysis, based on a deterministic discrete time (DDT) approach, of the normalized projection approximation subspace tracking (Normalized PAST) algorithms for estimating principal and minor components of an input signal. The proposed analysis shows that the DDT system of the Normalized PAST algorithm (for PCA/MCA), with any forgetting factor in a certain range, converges to a desired eigenvector. This eigenvector is completely characterized as the normalized version of the orthogonal projection of the initial estimate onto the eigensubspace corresponding to the largest/smallest eigenvalue of the autocorrelation matrix of the input signal. This characterization holds in the general case where the eigenvalues are not necessarily distinct. Numerical examples confirm that the proposed analysis accurately predicts the convergence behavior of the Normalized PAST algorithms, which use a rank-1 instantaneous approximation of the autocorrelation matrix.
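The limit characterization above can be illustrated numerically. The sketch below is a simplification, not the exact Normalized PAST recursion: it iterates a normalized DDT-style update w ← Rw/‖Rw‖ with a fixed autocorrelation matrix R whose largest eigenvalue is deliberately repeated, and checks that the iterate converges to the normalized orthogonal projection of the initial estimate onto the principal eigensubspace, as the abstract states. All matrix sizes, eigenvalues, and variable names are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Autocorrelation-like matrix with a REPEATED largest eigenvalue (5, 5, 1),
# so the principal eigensubspace is two-dimensional (eigenvalues not distinct).
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthonormal basis
R = Q @ np.diag([5.0, 5.0, 1.0]) @ Q.T

# Random unit-norm initial estimate.
w = rng.standard_normal(3)
w /= np.linalg.norm(w)
w0 = w.copy()

# Simplified normalized DDT update (power-iteration-like surrogate,
# NOT the full Normalized PAST recursion with forgetting factor).
for _ in range(200):
    w = R @ w
    w /= np.linalg.norm(w)

# Predicted limit per the characterization: the normalized orthogonal
# projection of w0 onto the eigensubspace of the largest eigenvalue,
# here spanned by the first two columns of Q.
P = Q[:, :2] @ Q[:, :2].T        # orthogonal projector onto that subspace
w_pred = P @ w0
w_pred /= np.linalg.norm(w_pred)

print(abs(w @ w_pred))           # expected to be very close to 1
```

Because the dominant eigenvalue is repeated, the iteration does not single out one eigenvector; which direction inside the two-dimensional eigensubspace it converges to is determined entirely by the initial estimate, exactly as the projection formula predicts.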