Dimensionality reduction is a common practice in many learning and intelligence applications. However, most existing methods take the dimension of the target subspace as a parameter, making it hard to decide which subspace is optimal for classification. In this paper, we address the challenge of learning the optimal subspace for nearest neighbor classification. We focus on labeled data and assume that the data for each class lie on respective sub-manifolds. To separate the sub-manifolds, the labels of the data are used to learn a subspace in which neighboring points of the same class stay close while those of different classes are pushed apart. The sub-manifold separating method is first proposed as a linear projection. For more complicated nonlinear situations, we generalize the algorithm using the kernel method. A group of experiments on data representation and classification is performed to evaluate the effectiveness of the proposed approaches.
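The linear variant described above can be illustrated with a short sketch. The code below is not the paper's exact algorithm but a minimal stand-in for this family of supervised graph-embedding methods (in the spirit of local discriminant embedding): it builds within-class and between-class nearest-neighbor graphs from the labels, then solves a generalized eigenproblem so that same-class neighbors contract and different-class neighbors separate in the projected subspace. The function name, the k-nearest-neighbor graph construction, and the regularization constant are all illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def discriminant_embedding(X, y, k=5, d=2):
    """Sketch of a label-informed linear embedding: learn a projection V
    that pulls same-class neighbors together and pushes different-class
    neighbors apart (an LDE/DNE-style objective, used here for illustration).

    X : (n, D) data matrix; y : (n,) class labels;
    k : neighbors per graph; d : target subspace dimension.
    Returns V : (D, d) projection matrix.
    """
    n = X.shape[0]
    dist = cdist(X, X)
    np.fill_diagonal(dist, np.inf)           # exclude self-matches
    W_w = np.zeros((n, n))                   # within-class neighbor graph
    W_b = np.zeros((n, n))                   # between-class neighbor graph
    for i in range(n):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        diff = np.where(y != y[i])[0]
        # connect i to its k nearest same-class and different-class points
        for j in same[np.argsort(dist[i, same])[:k]]:
            W_w[i, j] = W_w[j, i] = 1.0
        for j in diff[np.argsort(dist[i, diff])[:k]]:
            W_b[i, j] = W_b[j, i] = 1.0
    L_w = np.diag(W_w.sum(axis=1)) - W_w     # graph Laplacians
    L_b = np.diag(W_b.sum(axis=1)) - W_b
    # Maximize between-class spread relative to within-class spread:
    # top-d generalized eigenvectors of (X^T L_b X) v = lambda (X^T L_w X) v.
    A = X.T @ L_b @ X
    B = X.T @ L_w @ X + 1e-6 * np.eye(X.shape[1])  # small ridge for stability
    vals, vecs = eigh(A, B)
    V = vecs[:, np.argsort(vals)[::-1][:d]]
    return V
```

The kernel generalization mentioned in the abstract would replace the inner products implicit in `X.T @ L @ X` with kernel evaluations, restricting the projection directions to the span of the mapped training points.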