Traditional unsupervised dimensionality reduction techniques are widely used in many learning tasks, such as text classification and face recognition. In many applications, however, a few labeled examples are readily available. Thus, semi-supervised dimensionality reduction (SSDR), which can incorporate the label information, has attracted considerable research interest. In this paper, a novel SSDR approach is proposed that employs the harmonic function in a Gaussian random field to compute the states of all points. It constructs a complete weighted graph whose edge weights are assigned from the computed states, and then derives a linear projection matrix that maximizes the separation between points in different classes. For illustration, we provide theoretical analyses and promising classification results on several kinds of data sets. Compared with other dimensionality reduction approaches, the proposed method is more beneficial for classification. Compared with the transductive harmonic function method, it is inductive and can handle newly arriving data directly.
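The harmonic function step the abstract refers to can be sketched as follows. This is a minimal illustration of the standard Gaussian random field / harmonic solution (fix the states of labeled points, then solve a linear system on the graph Laplacian for the unlabeled ones), not the paper's full method; the function name, the `-1` encoding for unlabeled points, and the binary-label setting are illustrative assumptions.

```python
import numpy as np

def harmonic_states(W, labels):
    """Compute harmonic-function states on a weighted graph.

    W      : (n, n) symmetric affinity (edge weight) matrix
    labels : length-n array; 0/1 class labels for labeled points,
             -1 for unlabeled points (an assumed encoding)
    Returns a length-n vector of states in [0, 1] (binary case).
    """
    labeled = labels >= 0
    unlabeled = ~labeled
    # Combinatorial graph Laplacian L = D - W.
    D = np.diag(W.sum(axis=1))
    L = D - W
    # Harmonic solution: keep labeled states fixed and solve
    # L_uu f_u = -L_ul f_l for the unlabeled states.
    L_uu = L[np.ix_(unlabeled, unlabeled)]
    L_ul = L[np.ix_(unlabeled, labeled)]
    f = labels.astype(float).copy()
    f[unlabeled] = np.linalg.solve(L_uu, -L_ul @ f[labeled])
    return f

# Tiny example: a 3-node chain with the two endpoints labeled 0 and 1.
W = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
labels = np.array([0, -1, 1])
f = harmonic_states(W, labels)  # middle node settles at 0.5
```

These states could then serve as the edge-weight source for the complete graph that the projection matrix is learned from, in the spirit of the abstract's description.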