A new algorithm, Neighborhood MinMax Projections (NMMP), is proposed in this paper for supervised dimensionality reduction. The algorithm learns a linear transformation and focuses only on the pairwise points that are neighbors of each other. After the transformation, the considered pairwise points within the same class are as close as possible, while those between different classes are as far apart as possible. We formulate this problem as a constrained optimization problem whose global optimum can be obtained effectively and efficiently. Compared with the popular supervised method, Linear Discriminant Analysis (LDA), our method has three significant advantages. First, it can extract more discriminative features. Second, it can handle class distributions that are more complex than Gaussian. Third, the singularity problem that arises in LDA naturally does not occur. The performance on several data sets demonstrates the effectiveness of the proposed method.
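The idea described above can be sketched as follows. This is a minimal, hedged illustration rather than the paper's exact method: it builds a within-class scatter matrix from each point's nearest same-class neighbors and a between-class scatter from its nearest different-class neighbors (a simplification of the strict mutual-neighbor condition in the abstract), then maximizes the trace ratio tr(WᵀS_b W) / tr(WᵀS_w W) over orthonormal W with the standard iterative eigen-update. All function and parameter names are illustrative.

```python
import numpy as np

def nmmp_sketch(X, y, n_components=2, k_within=3, k_between=3, n_iter=20):
    """Illustrative NMMP-style projection (not the paper's exact algorithm).

    X : (n, d) data matrix, y : (n,) integer class labels.
    Returns an orthonormal (d, n_components) projection matrix W.
    """
    n, d = X.shape
    # Pairwise squared Euclidean distances; diagonal excluded from neighbor search.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(D, np.inf)

    Sw = np.zeros((d, d))  # scatter of within-class neighbor pairs (to shrink)
    Sb = np.zeros((d, d))  # scatter of between-class neighbor pairs (to stretch)
    for i in range(n):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        other = np.where(y != y[i])[0]
        # Nearest same-class neighbors contribute to Sw.
        for j in same[np.argsort(D[i, same])[:k_within]]:
            diff = (X[i] - X[j])[:, None]
            Sw += diff @ diff.T
        # Nearest different-class neighbors contribute to Sb.
        for j in other[np.argsort(D[i, other])[:k_between]]:
            diff = (X[i] - X[j])[:, None]
            Sb += diff @ diff.T

    # Trace-ratio optimization: max tr(W' Sb W) / tr(W' Sw W), W'W = I.
    # Iteratively take the top eigenvectors of (Sb - lam * Sw), then update lam.
    lam = 0.0
    W = np.eye(d)[:, :n_components]
    for _ in range(n_iter):
        _, vecs = np.linalg.eigh(Sb - lam * Sw)
        W = vecs[:, -n_components:]  # eigenvectors of the largest eigenvalues
        lam = np.trace(W.T @ Sb @ W) / max(np.trace(W.T @ Sw @ W), 1e-12)
    return W
```

Note that, unlike LDA, nothing here requires inverting S_w, which is why the singularity problem does not arise in this kind of formulation; the number of extracted features is also not capped at (number of classes − 1).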