In this paper, the convergence of a recently proposed blind source separation (BSS) algorithm is analyzed. The algorithm uses the Kullback-Leibler divergence to generate non-negative matrix factorizations of the observation vectors, which is an important aspect of the BSS approach. The analysis constructs invariant sets under which convergence of the algorithm is guaranteed for the given conditions. In simulations, the algorithm and the analysis results are successfully applied to the blind separation of mixed images and signals.
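Since the abstract does not reproduce the update rules, here is a minimal illustrative sketch of the kind of factorization involved: the classic Lee-Seung multiplicative updates that minimize the generalized Kullback-Leibler divergence D(V || WH). This is a generic KL-NMF routine, not the specific BSS algorithm analyzed in the paper; the function name and parameters are assumptions for illustration.

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, eps=1e-9, seed=0):
    """Factorize V ~= W @ H with non-negative factors by minimizing the
    generalized Kullback-Leibler divergence D(V || WH) using the standard
    Lee-Seung multiplicative updates (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        # H update: H <- H * (W^T (V / WH)) / (W^T 1)
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
        # W update: W <- W * ((V / WH) H^T) / (1 H^T)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)
    return W, H

def kl_div(V, W, H, eps=1e-9):
    """Generalized KL divergence D(V || WH)."""
    WH = W @ H + eps
    return float(np.sum(V * np.log((V + eps) / WH) - V + WH))
```

Because the multiplicative updates keep every entry non-negative and are known to be non-increasing in the KL objective, the iterates stay inside the non-negative orthant, which is the kind of invariant-set property the paper's convergence analysis formalizes.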