The instantaneous noise-free linear mixing model in independent component analysis is largely a solved problem under the usual assumptions of independent nongaussian sources and a full-column-rank mixing matrix. However, with some prior information on the sources, such as positivity, new analysis and perhaps simplified solution methods may become possible. In this letter, we consider the task of independent component analysis when the independent sources are known to be nonnegative and well grounded, meaning that they have a nonzero pdf in the region of zero. It can be shown that in this case the solution method is in principle very simple: an orthogonal rotation of the whitened observation vector into nonnegative outputs gives a positive permutation of the original sources. We propose a cost function whose minimum coincides with nonnegativity and derive the gradient algorithm under the whitening constraint, which restricts the separating matrix to be orthogonal. We further prove that on the Stiefel manifold of orthogonal matrices, the cost function is a Lyapunov function for the matrix gradient flow, implying global convergence. Thus, the algorithm is guaranteed to find the nonnegative well-grounded independent sources. The analysis is complemented by a numerical simulation illustrating the algorithm.
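The separation scheme the abstract describes — whiten the observations, then rotate them orthogonally until the outputs are nonnegative — can be sketched in a few lines of NumPy. This is an illustrative reconstruction under stated assumptions, not the paper's exact algorithm: the quadratic penalty on negative outputs and the SVD-based re-orthogonalization step stand in for the paper's cost function and its gradient flow on the Stiefel manifold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonnegative, well-grounded sources (half-normal: nonzero pdf at the origin)
n, T = 2, 5000
S = np.abs(rng.standard_normal((n, T)))
A = rng.standard_normal((n, n))          # full-column-rank mixing matrix
X = A @ S                                # observed mixtures

# Whiten with respect to the covariance only (the mean is kept, so the
# underlying sources stay nonnegative): cov(Z) = I, and Z = Q S for some
# orthogonal Q, so an orthogonal rotation of Z can recover the sources.
d, E = np.linalg.eigh(np.cov(X))
V = E @ np.diag(d ** -0.5) @ E.T
Z = V @ X

def cost(W, Z):
    """E[||min(Wz, 0)||^2]: zero exactly when all outputs are nonnegative."""
    return np.mean(np.minimum(W @ Z, 0.0) ** 2)

# Gradient descent with re-orthogonalization after each step (a simple
# retraction back onto the manifold of orthogonal matrices).
W = np.linalg.qr(rng.standard_normal((n, n)))[0]   # random orthogonal start
j0 = cost(W, Z)
eta = 0.1
for _ in range(500):
    Y = W @ Z
    G = 2.0 * (np.minimum(Y, 0.0) @ Z.T) / T       # Euclidean gradient
    U, _, Vt = np.linalg.svd(W - eta * G)
    W = U @ Vt                                     # nearest orthogonal matrix

print(j0, cost(W, Z))   # the cost should drop toward zero as outputs turn nonnegative
```

The SVD projection (polar retraction) is one of several ways to enforce the orthogonality constraint; the paper instead analyzes the continuous-time gradient flow on the Stiefel manifold, for which the cost is shown to be a Lyapunov function.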