Graph-based Semi-Supervised Learning (SSL) has been an active topic in machine learning for about a decade. How to construct the graph is the central concern in recent work, since a well-designed graph structure can significantly boost the final performance. In this paper, we first review several different graph constructions for graph-based SSL. We then conduct a series of experiments on benchmark data sets to give a comprehensive evaluation of the advantages and shortcomings of each. Experimental results show that: a) when the data lie on independent subspaces and enough labeled data are available, the low-rank representation based method performs best; and b) in the majority of cases, the local sparse representation based method performs best, especially when labeled data are few.
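To make the setting concrete, the following is a minimal sketch of graph-based SSL with a k-nearest-neighbor Gaussian affinity graph and iterative label propagation (in the style of Zhou et al.'s "learning with local and global consistency"). It is an illustration of the general pipeline the paper evaluates, not any one of the reviewed constructions; all function names, the parameter choices (k, sigma, alpha), and the stopping rule are assumptions for the sketch.

```python
import numpy as np

def knn_graph(X, k=3, sigma=1.0):
    """Build a symmetric k-NN affinity matrix with Gaussian edge weights.

    Hypothetical helper for illustration: each point is connected to its
    k nearest neighbors, and the graph is symmetrized by taking the max.
    """
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # nearest neighbors, skipping self
        W[i, idx] = np.exp(-d2[i, idx] / (2.0 * sigma ** 2))
    return np.maximum(W, W.T)

def propagate(W, y, alpha=0.9, iters=200):
    """Spread labels over the graph: F <- alpha * S @ F + (1 - alpha) * Y.

    y holds class indices for labeled points and -1 for unlabeled ones;
    S is the symmetrically normalized affinity D^{-1/2} W D^{-1/2}.
    """
    n = W.shape[0]
    d = W.sum(1)
    d[d == 0] = 1.0  # guard isolated nodes against division by zero
    S = W / np.sqrt(np.outer(d, d))
    classes = sorted({c for c in y if c >= 0})
    Y = np.zeros((n, len(classes)))
    for i, c in enumerate(y):
        if c >= 0:
            Y[i, classes.index(c)] = 1.0
    F = Y.copy()
    for _ in range(iters):
        F = alpha * (S @ F) + (1 - alpha) * Y
    return np.array([classes[j] for j in F.argmax(1)])

# Usage: two well-separated Gaussian clusters, one labeled point per class.
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(10, 2), rng.randn(10, 2) + 5.0])
y = np.full(20, -1)
y[0], y[10] = 0, 1
pred = propagate(knn_graph(X, k=3), y)
```

The choice of `knn_graph` here is exactly the design decision the paper's experiments probe: swapping this construction for a linear-neighborhood, sparse-representation, or low-rank-representation graph changes the affinity matrix `W` while leaving the propagation step unchanged.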