In this paper, we present a new method, called large margin based nonnegative matrix factorization (LMNMF), for encoding latent discriminant information in training data. LMNMF seeks a nonnegative subspace in which the k nearest neighbors of each sample always belong to the same class and samples from different classes are separated by a large margin, so that the local separation structure of the data is explicit. The large-margin criterion leads to a new objective function, and a multiplicative nonnegative update rule with provable convergence is derived to learn the basis matrix and the encoding vectors. Partial least squares regression (PLSR) then learns the mapping from the original data to the low-dimensional representations in order to capture the local separation information; PLSR also offers a unified solution to the out-of-sample extension problem. Extensive experimental results demonstrate that LMNMF with PLSR yields significant classification improvements over several commonly used NMF-based algorithms.