Most manifold learning algorithms construct the adjacency graph with a k-nearest-neighbor rule. However, this can introduce severe bias when the samples are not uniformly distributed in the ambient space. In this paper, a semi-supervised dimensionality reduction method is proposed to alleviate this problem. Based on the notion of local margin, we simultaneously maximize the separability between different classes and estimate the intrinsic geometric structure of the data, using both labeled and unlabeled samples. For high-dimensional data, a discriminant subspace is derived by maximizing the cumulative local margins. Experimental results on high-dimensional classification tasks demonstrate the efficacy of our algorithm.
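The abstract's core idea can be illustrated with a minimal sketch. The code below is one plausible instantiation, not the paper's exact formulation: for each point it collects k same-class and k different-class nearest neighbors, accumulates within-class and between-class local scatter matrices, and takes the top eigenvectors of their difference so that projections maximize the cumulative local margin. All function and parameter names here are hypothetical.

```python
import numpy as np

def local_margin_embedding(X, y, k=5, d=2):
    """Hypothetical sketch of margin-based linear dimensionality reduction.

    Builds kNN neighborhoods among same-class (within) and different-class
    (between) samples, then maximizes the cumulative local margin --
    between-class minus within-class local scatter -- via an eigenproblem.
    """
    n, D = X.shape
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Sw = np.zeros((D, D))  # within-class local scatter
    Sb = np.zeros((D, D))  # between-class local scatter
    for i in range(n):
        same = np.where((y == y[i]) & (np.arange(n) != i))[0]
        diff = np.where(y != y[i])[0]
        # k nearest same-class neighbors pull Sw together ...
        for j in same[np.argsort(d2[i, same])[:k]]:
            v = (X[i] - X[j])[:, None]
            Sw += v @ v.T
        # ... k nearest different-class neighbors push Sb apart.
        for j in diff[np.argsort(d2[i, diff])[:k]]:
            v = (X[i] - X[j])[:, None]
            Sb += v @ v.T
    # Maximize trace(W^T (Sb - Sw) W): top-d eigenvectors of Sb - Sw.
    w, V = np.linalg.eigh(Sb - Sw)
    W = V[:, np.argsort(w)[::-1][:d]]
    return X @ W
```

A semi-supervised variant, as described in the abstract, would additionally add an unsupervised locality-preserving term built from the unlabeled samples' kNN graph before solving the eigenproblem.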