Nonnegative matrix factorization (NMF) has become a popular data-representation method and has been widely used in image processing and pattern-recognition problems. This is because the learned bases can be interpreted as a natural parts-based representation of the data, an interpretation consistent with the psychological intuition of combining parts to form a whole. For practical classification tasks, however, NMF ignores both the local geometry of the data and the discriminative information of different classes. In addition, existing research shows that the learned basis is not necessarily parts-based, because there is neither an explicit nor an implicit constraint to ensure that the representation is parts-based. In this paper, we introduce manifold regularization and margin maximization into NMF and obtain the manifold regularized discriminative NMF (MD-NMF) to overcome these problems. The multiplicative update rule (MUR) can be applied to optimize MD-NMF, but it converges slowly. We therefore propose a fast gradient descent (FGD) method to optimize MD-NMF. FGD contains a Newton method that searches for the optimal step length, and thus FGD converges much faster than MUR. In addition, FGD includes MUR as a special case and can also be applied to optimize NMF and its variants. For a problem with 165 samples in R^1600, FGD converges in 28 s, whereas MUR requires 282 s. We also apply FGD to a variant of MD-NMF, and the results confirm its efficiency. Experiments on several face image datasets demonstrate the effectiveness of MD-NMF.
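As a point of reference for the multiplicative update rule (MUR) that MD-NMF builds on, the following is a minimal sketch of plain NMF with the classical Lee–Seung multiplicative updates minimizing the Frobenius error ||X − WH||². This is standard NMF only; the MD-NMF objective additionally carries the manifold-regularization and margin-maximization terms described above, which change the update formulas. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def nmf_mur(X, r, n_iter=200, eps=1e-10, seed=0):
    """Plain NMF via Lee-Seung multiplicative updates.

    Minimizes ||X - W @ H||_F^2 with W >= 0, H >= 0.
    Illustrative sketch of the MUR baseline; MD-NMF adds manifold
    and discriminative terms to this objective.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + eps   # nonnegative basis images
    H = rng.random((r, n)) + eps   # nonnegative coefficients
    for _ in range(n_iter):
        # Elementwise multiplicative updates; eps avoids division by zero.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because each update multiplies by a nonnegative ratio, nonnegativity of W and H is preserved automatically, which is exactly why MUR needs no step-length parameter, and also why it can converge slowly.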
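The key ingredient of FGD is a Newton search for the optimal step length along the update direction, rather than the implicit unit step taken by MUR. The paper derives the required derivatives analytically for its objective; as a generic, hypothetical illustration of the idea, the sketch below runs Newton iterations on a one-dimensional step-length objective phi(eta) using finite-difference derivatives, clamping the step to stay nonnegative.

```python
import numpy as np

def newton_step_length(phi, eta0=1.0, n_iter=10, h=1e-4):
    """Newton search for the step length eta minimizing phi(eta).

    Hypothetical helper sketching FGD's step-length search; the paper
    computes phi' and phi'' analytically for the MD-NMF objective,
    whereas here they are approximated by central finite differences.
    """
    eta = eta0
    for _ in range(n_iter):
        d1 = (phi(eta + h) - phi(eta - h)) / (2 * h)          # phi'(eta)
        d2 = (phi(eta + h) - 2 * phi(eta) + phi(eta - h)) / h**2  # phi''(eta)
        if abs(d2) < 1e-12:
            break
        # Newton step, clamped so the factor iterate stays feasible.
        eta = max(eta - d1 / d2, 0.0)
    return eta
```

In an FGD-style loop one would set phi(eta) to the objective evaluated along the MUR direction, e.g. phi(eta) = ||X − (W + eta·D)H||² with D the rescaled-gradient direction; eta = 1 then recovers the plain multiplicative update, so the searched step can only do as well or better per iteration.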