This paper analyzes a previously proposed variant of nonnegative matrix factorization (NMF) called projective nonnegative matrix factorization (PNMF). PNMF approximately factorizes a projection matrix, minimizing the reconstruction error, into a nonnegative low-rank matrix and its transpose. The dissimilarity between the original data matrix and its approximation can be measured either by the Frobenius matrix norm or by the modified Kullback-Leibler divergence; both measures are minimized by multiplicative update rules, whose convergence is proven here for the first time. Enforcing orthonormality in the basic objective is shown to lead to an even more efficient update rule, which also extends readily to nonlinear cases. The PNMF objective is shown to be connected to a variety of existing NMF methods and clustering approaches, and a derivation using Lagrange multipliers reveals the relation between reconstruction and sparseness. For kernel principal component analysis (PCA) with a binary constraint, useful in graph partitioning problems, the nonlinear kernel PNMF provides a good approximation that outperforms an existing discretization approach. An empirical study on three real-world databases shows that PNMF achieves the best, or close to the best, clustering performance. The proposed algorithm also runs more efficiently than the compared NMF methods, especially on high-dimensional data. Moreover, unlike basic NMF, the trained projection matrix can be applied directly to new samples and demonstrates good generalization.