Minor component analysis (MCA) is a powerful statistical tool for signal processing and data analysis, and the convergence of MCA learning algorithms is an important issue in practical applications. In this paper, we propose a simple MCA learning algorithm to extract the minor component from input signals. The dynamics of the proposed algorithm are analysed via a corresponding deterministic discrete time (DDT) system. It is proved that almost all trajectories of the DDT system converge to the minor component, provided the learning rate satisfies some mild conditions and the trajectories start from points in an invariant set. Simulation results are furnished to illustrate the theoretical results.
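To make the DDT viewpoint concrete, the sketch below runs a generic MCA iteration on a fixed sample covariance matrix: projected gradient descent on the Rayleigh quotient with renormalization at each step. This is an illustrative minor-component rule under assumed parameters (step size `eta`, synthetic data), not necessarily the specific algorithm proposed in the paper; the DDT system replaces the stochastic input term with the covariance matrix `C`, as described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 4-dimensional signal with well-separated variances,
# so the minor component is clearly identifiable.
X = rng.standard_normal((500, 4)) @ np.diag([3.0, 2.0, 1.0, 0.3])
C = X.T @ X / len(X)  # sample covariance

# DDT iteration: gradient descent on the Rayleigh quotient w^T C w,
# renormalized each step (a generic MCA rule, hypothetical parameters).
eta = 0.1  # learning rate; must be small relative to 1/lambda_max
w = rng.standard_normal(4)
w /= np.linalg.norm(w)
for _ in range(2000):
    w = w - eta * (C @ w - (w @ C @ w) * w)
    w /= np.linalg.norm(w)

# Compare with the eigenvector of the smallest eigenvalue of C.
eigvals, eigvecs = np.linalg.eigh(C)  # ascending eigenvalues
v_min = eigvecs[:, 0]
print(abs(w @ v_min))  # close to 1 if w converged to the minor component
```

With the learning rate well below the reciprocal of the largest eigenvalue, the component of `w` along each larger eigenvector shrinks at every step, which is the discrete-time analogue of the convergence behaviour the paper establishes for its DDT system.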