In machine learning, the Gaussian process latent variable model (GP-LVM) has been widely applied to unsupervised dimensionality reduction. When supervised information, e.g., pairwise constraints or labels of the data, is available, the traditional GP-LVM cannot directly exploit it to improve the performance of dimensionality reduction. In this case, the traditional GP-LVM must be modified to handle supervised or semi-supervised learning tasks. To this end, we propose a new semi-supervised GP-LVM framework under pairwise constraints. By transferring the pairwise constraints from the observed space to the latent space, constrained prior information on the latent variables is obtained. Under this constrained prior, the latent variables are optimized by maximum a posteriori (MAP) estimation. The effectiveness of the proposed algorithm is demonstrated through experiments on a variety of data sets.
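A minimal sketch of the idea described above: a GP-LVM whose latent coordinates are fitted by MAP under a pairwise-constraint prior. The RBF kernel, the quadratic must-link / hinge cannot-link penalties, the penalty weight `lam`, and all function names are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, gamma=1.0):
    # Squared-exponential kernel on the latent coordinates X (N x q).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-0.5 * gamma * d2)

def neg_log_posterior(x_flat, Y, q, must_link, cannot_link, noise=1e-2, lam=1.0):
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = rbf_kernel(X) + noise * np.eye(N)
    # GP-LVM marginal likelihood: D independent GP outputs sharing kernel K.
    _, logdet = np.linalg.slogdet(K)
    nll = 0.5 * D * logdet + 0.5 * np.sum(Y * np.linalg.solve(K, Y))
    # Constraint prior transferred to the latent space: must-link pairs are
    # pulled together; cannot-link pairs are pushed apart via a hinge term.
    prior = 0.0
    for i, j in must_link:
        prior += np.sum((X[i] - X[j]) ** 2)
    for i, j in cannot_link:
        prior += max(0.0, 1.0 - np.linalg.norm(X[i] - X[j])) ** 2
    return nll + lam * prior

def fit_ss_gplvm(Y, q=2, must_link=(), cannot_link=(), lam=1.0, seed=0):
    # MAP estimate of the latent coordinates by gradient-free L-BFGS-B
    # (finite differences; fine for small N in this illustrative sketch).
    rng = np.random.default_rng(seed)
    x0 = 0.1 * rng.standard_normal(Y.shape[0] * q)
    res = minimize(neg_log_posterior, x0,
                   args=(Y, q, list(must_link), list(cannot_link)),
                   method="L-BFGS-B",
                   options={"maxiter": 200})
    return res.x.reshape(Y.shape[0], q)
```

With a strong enough penalty weight, must-linked points end up closer in the latent space than cannot-linked ones, which is the intended effect of the constrained prior.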