The stability of minor component analysis (MCA) learning algorithms is an important problem in many signal processing applications. In this paper, we propose an effective MCA learning algorithm that offers improved stability. The dynamics of the proposed algorithm are analyzed via a corresponding deterministic discrete time (DDT) system. It is proven that, if the learning rate satisfies some mild conditions, almost all trajectories of the DDT system starting from points in an invariant set are bounded and converge to the minor component of the autocorrelation matrix of the input data. Simulation results are provided to illustrate the theoretical results.
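The minor component referred to above is the eigenvector associated with the smallest eigenvalue of the input autocorrelation matrix. As an illustration only (this is not the specific algorithm proposed in the paper), the sketch below runs a generic MCA-style iteration, namely gradient descent on the Rayleigh quotient with explicit renormalization, and checks convergence against a direct eigendecomposition. The synthetic data, dimensions, learning rate, and iteration count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic input data with distinct per-coordinate variances,
# so the autocorrelation matrix has a clear smallest eigenvalue.
n = 5
scales = np.array([1.0, 1.2, 1.5, 2.0, 3.0])
A = rng.standard_normal((500, n)) * scales

# Autocorrelation matrix of the input data.
C = (A.T @ A) / A.shape[0]

# Generic MCA-style iteration (a sketch, not the paper's algorithm):
# descend the Rayleigh quotient w'Cw on the unit sphere, renormalizing
# each step for numerical stability.
eta = 0.05                      # illustrative learning rate
w = rng.standard_normal(n)
w /= np.linalg.norm(w)
for _ in range(5000):
    Cw = C @ w
    w = w - eta * (Cw - (w @ Cw) * w)   # projected gradient step
    w /= np.linalg.norm(w)              # keep w on the unit sphere

# Reference answer: eigenvector of the smallest eigenvalue of C.
vals, vecs = np.linalg.eigh(C)          # eigenvalues in ascending order
v_min = vecs[:, 0]
alignment = abs(w @ v_min)              # near 1 when w matches the minor component
```

A small learning rate is what keeps the discrete-time iteration bounded here, which mirrors the role of the learning-rate conditions in the DDT analysis described in the abstract.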