In this paper, the multistability of a class of Amari's α-divergence based nonnegative matrix factorization (NMF) learning algorithms is analyzed. The analysis shows that invariant sets of the update algorithms can be constructed; within these invariant sets, the non-divergence of the discussed algorithms is guaranteed. Based on Lyapunov's stability theorem, the local convergence of this class of learning algorithms is proved in the domain of their update rules. In simulations, the analysis results are applied to image representation. Experimental results demonstrate that selecting suitable initial data is very important for the different applications of these NMF algorithms.
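To make the class of algorithms concrete, the following is a minimal NumPy sketch of the standard multiplicative updates for α-divergence NMF (for α ≠ 0), in the form given by Cichocki and Amari. The function name `alpha_nmf`, the parameter defaults, and the small `eps` smoothing term are illustrative choices, not from the paper; the paper's own analysis concerns the stability of this family of update rules, which is why the (strictly positive) initialization of W and H matters.

```python
import numpy as np

def alpha_nmf(V, r, alpha=0.5, iters=300, seed=0, eps=1e-9):
    """Multiplicative updates minimizing the alpha-divergence
    D_alpha(V || WH), alpha != 0. A hedged sketch: normalization
    conventions vary across the literature."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Strictly positive initialization: the convergence analysis
    # requires iterates to stay inside a positive invariant set.
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    ones = np.ones_like(V)
    for _ in range(iters):
        # H-update: H <- H * (W^T (V/WH)^alpha / W^T 1)^(1/alpha)
        R = (V / (W @ H + eps)) ** alpha
        H *= (W.T @ R / (W.T @ ones + eps)) ** (1.0 / alpha)
        # W-update: W <- W * ((V/WH)^alpha H^T / 1 H^T)^(1/alpha)
        R = (V / (W @ H + eps)) ** alpha
        W *= (R @ H.T / (ones @ H.T + eps)) ** (1.0 / alpha)
    return W, H
```

Re-running the sketch with different `seed` values gives a quick illustration of the multistability discussed above: different positive initializations can settle into different local factorizations of the same data matrix.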