On the Convergence of Multiplicative Update Algorithms for Nonnegative Matrix Factorization
IEEE Transactions on Neural Networks
Nonnegative matrix factorization (NMF) is the problem of approximating a given nonnegative matrix by the product of two nonnegative matrices. The multiplicative updates proposed by Lee and Seung are widely used as efficient computational methods for NMF. However, the global convergence of these updates is not formally guaranteed because they are not defined for all pairs of nonnegative matrices. In this paper, we consider slightly modified versions of the original multiplicative updates and study their global convergence properties. The only difference between the modified updates and the original ones is that the former do not allow variables to take values less than a user-specified positive constant. Using Zangwill's global convergence theorem, we prove that any sequence of solutions generated by either of those modified updates has at least one convergent subsequence and the limit of any convergent subsequence is a stationary point of the corresponding optimization problem. Furthermore, we propose algorithms based on the modified updates that always stop within a finite number of iterations.
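The core idea of the modification can be illustrated with a short sketch. The code below is a minimal, hypothetical implementation (not the paper's exact algorithm or stopping rule): it runs the standard Lee–Seung multiplicative updates for the Frobenius-norm objective, but clips every variable at a user-specified constant `eps > 0` after each update. Because all entries of W and H stay at least `eps`, the denominators in the updates are strictly positive, so the updates are defined for every iterate.

```python
import numpy as np

def modified_mu_nmf(V, r, eps=1e-8, n_iter=200, seed=0):
    """Sketch of epsilon-modified multiplicative updates for NMF.

    Approximately minimizes ||V - W H||_F^2 while keeping all entries
    of W and H at least eps, so every update is well defined even when
    a denominator of the original Lee-Seung update would vanish.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Standard multiplicative factors, then clip entries at eps
        H = np.maximum(eps, H * (W.T @ V) / (W.T @ W @ H))
        W = np.maximum(eps, W * (V @ H.T) / (W @ H @ H.T))
    return W, H
```

The clipping step `np.maximum(eps, ...)` is the only difference from the original updates; the finite-termination algorithms mentioned in the abstract would additionally replace the fixed iteration count with a stopping criterion, which is not reproduced here.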