Supervised learning is difficult with high-dimensional input spaces and very small training sets, but accurate classification may still be possible if the data lie on a low-dimensional manifold. Gaussian Process Latent Variable Models (GP-LVMs) can discover low-dimensional manifolds from only a small number of examples, but they learn the latent space without regard for class labels. Existing methods for discriminative manifold learning (e.g., LDA, GDA) do constrain the class distribution in the latent space, but they are generally deterministic and may generalize poorly from limited training data. We introduce a method for Gaussian Process classification based on latent variable models trained with discriminative priors over the latent space, which can learn a discriminative latent space from a small training set.
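The idea in the abstract — a GP-LVM whose latent coordinates are regularised by a discriminative prior — can be sketched roughly as follows. This is a minimal illustrative toy, not the paper's actual formulation: the RBF kernel, the LDA-style within/between-class energy used as the prior, the weight `lam`, and the finite-difference gradient descent are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy high-dimensional data: two classes, very few examples (sizes are illustrative).
n_per, D, q = 4, 10, 2
Y = np.vstack([rng.normal(0.0, 1.0, (n_per, D)) + 2.0,
               rng.normal(0.0, 1.0, (n_per, D)) - 2.0])
Y -= Y.mean(axis=0)
labels = np.array([0] * n_per + [1] * n_per)

def rbf_kernel(X, gamma=1.0, noise=1e-2):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2) + noise * np.eye(len(X))

def gplvm_nll(X):
    # GP-LVM marginal likelihood: each of the D output dimensions is an
    # independent zero-mean GP over the latent points X.
    K = rbf_kernel(X)
    _, logdet = np.linalg.slogdet(K)
    return 0.5 * (D * logdet + np.trace(np.linalg.solve(K, Y @ Y.T)))

def discriminative_prior(X):
    # LDA-style energy (an assumed stand-in for the paper's prior):
    # small when within-class spread is small relative to between-class spread.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    same = labels[:, None] == labels[None, :]
    return d2[same].sum() / (d2[~same].sum() + 1e-9)

lam = 10.0  # illustrative prior weight
def objective(X):
    return gplvm_nll(X) + lam * discriminative_prior(X)

def num_grad(f, X, eps=1e-5):
    # Central finite differences -- fine for this tiny problem.
    g = np.zeros_like(X)
    for idx in np.ndindex(*X.shape):
        Xp, Xm = X.copy(), X.copy()
        Xp[idx] += eps
        Xm[idx] -= eps
        g[idx] = (f(Xp) - f(Xm)) / (2 * eps)
    return g

# Initialise latents with PCA, then refine by gradient descent with backtracking.
U, S, _ = np.linalg.svd(Y, full_matrices=False)
X = U[:, :q] * S[:q]
f0 = objective(X)
for _ in range(50):
    g = num_grad(objective, X)
    step, fX = 0.1, objective(X)
    while objective(X - step * g) > fX and step > 1e-8:
        step *= 0.5
    Xn = X - step * g
    if objective(Xn) < fX:   # only accept improving steps
        X = Xn
f1 = objective(X)
```

Compared with fitting an unconstrained GP-LVM, the prior term pulls same-class latent points together and pushes classes apart, so the learned latent space is shaped by the labels even when the training set is tiny.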