In many real-world applications, the Euclidean distance in the original input space performs poorly because of the curse of dimensionality. In this paper, we propose a new method, called Discriminant Neighborhood Embedding (DNE), which learns a metric space appropriate for classification from a finite set of training samples. We define a discriminant adjacency matrix tailored to the classification task: neighboring samples from the same class are squeezed together, while neighboring samples from different classes are pushed as far apart as possible. The proposed method also estimates the optimal dimensionality of the metric space by spectral analysis, which is of great significance for high-dimensional patterns. Experiments on a variety of datasets demonstrate the effectiveness of our method.
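The abstract's idea can be sketched in a few lines of NumPy. The following is an illustrative approximation, not the authors' implementation: it assumes a signed k-nearest-neighbor adjacency matrix (+1 for neighboring pairs from different classes, -1 for neighboring pairs from the same class), maximizes the resulting graph-Laplacian quadratic form, and keeps the eigenvectors with positive eigenvalues as the spectral estimate of the dimensionality. The function name `dne_embed` and all parameter choices are hypothetical.

```python
import numpy as np

def dne_embed(X, y, k=3):
    """Sketch of a DNE-style linear embedding (illustrative, not the paper's code).

    X: (n, d) array of samples, y: (n,) array of class labels.
    Returns a (d, d') projection matrix whose columns are the eigenvectors
    of the scatter-like matrix with positive eigenvalues.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances; exclude self-matches.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(D2, np.inf)

    # Signed adjacency: +1 for different-class neighbors (to be separated),
    # -1 for same-class neighbors (to be squeezed together).
    F = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(D2[i])[:k]:
            F[i, j] = F[j, i] = 1.0 if y[i] != y[j] else -1.0

    # Laplacian of the signed graph and the induced scatter-like matrix:
    # maximizing tr(P^T X^T L X P) expands +1 pairs and contracts -1 pairs.
    L = np.diag(F.sum(axis=1)) - F
    M = X.T @ L @ X
    M = (M + M.T) / 2.0                  # symmetrize against round-off

    w, V = np.linalg.eigh(M)
    # Spectral dimensionality estimate: keep only positive-eigenvalue directions.
    return V[:, w > 1e-10]
```

Projecting with `X @ dne_embed(X, y)` then yields coordinates in the learned metric space, where a nearest-neighbor classifier can be applied; the number of retained columns is the estimated optimal dimensionality.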