Graph construction plays a key role in learning algorithms based on the graph Laplacian. However, the traditional ε-neighborhood and k-nearest-neighbor construction approaches require the same neighbor parameter ε (or k) to be predefined for all samples, which makes parameter selection difficult and generally fails to fit the intrinsic structure of the data. To mitigate these limitations, in this paper we present a novel, sample-dependent approach to graph construction, and we name the resulting graph the Sample-dependent Graph (SG). Specifically, instead of predefining the same neighbor parameter for all samples, the SG lets the samples themselves determine the neighbors of each sample and the similarities between sample pairs. As a result, it not only avoids the intractable and expensive selection of a neighbor parameter but also fits the intrinsic structure of the data more effectively. Further, to demonstrate the effectiveness of the SG, we apply it to dimensionality reduction based on graph embedding, incorporating it into the widely used unsupervised locality preserving projection (LPP) to develop the sample-dependent LPP (SLPP). SLPP naturally inherits the merits of the SG while retaining the attractive properties of the traditional LPP. Experiments on toy and benchmark datasets (UCI, face recognition, object categorization, and handwritten digit recognition) demonstrate the effectiveness and feasibility of the SG and SLPP, with promising results.
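The abstract does not specify the SG construction rule, so the following is only a minimal sketch of the general idea: each sample's neighbor set is decided by the data rather than by a global ε or k. Two assumptions not stated in the abstract are made for illustration: (1) similarities come from a heat kernel, and (2) x_j is accepted as a neighbor of x_i when their similarity exceeds x_i's mean similarity to the other samples. The `slpp` function then shows how any such weight matrix W plugs into LPP's standard generalized eigenproblem X L Xᵀ a = λ X D Xᵀ a.

```python
import numpy as np

def sample_dependent_graph(X, sigma=1.0):
    """Build a symmetric affinity matrix W with no global neighbor parameter.

    X : (n, d) array of samples.
    sigma : heat-kernel width (illustrative choice, not from the paper).
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances via the expansion ||a-b||^2.
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    S = np.exp(-d2 / (2.0 * sigma**2))    # heat-kernel similarities
    np.fill_diagonal(S, 0.0)              # ignore self-similarity
    # Assumed sample-dependent rule: keep edges above each row's mean similarity.
    thresh = S.sum(axis=1) / (n - 1)
    mask = S > thresh[:, None]
    # Symmetrize: keep the edge if either endpoint accepts it.
    return np.where(mask | mask.T, S, 0.0)

def slpp(X, W, dim=2):
    """LPP projection directions for a given weight matrix W (here, the SG).

    Solves X L X^T a = lambda X D X^T a for the smallest eigenvalues,
    where L = D - W is the graph Laplacian.
    """
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])   # small ridge for stability
    # Generalized symmetric eigenproblem via whitening with B^{-1/2}.
    w, U = np.linalg.eigh(B)
    Bh = U @ np.diag(w**-0.5) @ U.T
    vals, vecs = np.linalg.eigh(Bh @ A @ Bh)
    return Bh @ vecs[:, :dim]                     # (d, dim) projection matrix
```

A conventional k-NN graph would instead fix one k for every row of S; here the per-row threshold lets dense regions keep many neighbors and sparse regions few, which is the behavior the abstract attributes to the SG.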