We introduce a new perspective on spectral dimensionality reduction which views these methods as Gaussian Markov random fields (GRFs). Our unifying perspective is based on the maximum entropy principle, which is in turn inspired by maximum variance unfolding. The resulting model, which we call maximum entropy unfolding (MEU), is a nonlinear generalization of principal component analysis. We relate the model to Laplacian eigenmaps and Isomap. We show that parameter fitting in locally linear embedding (LLE) is approximate maximum likelihood in MEU. We introduce a variant of LLE that performs maximum likelihood exactly: acyclic LLE (ALLE). We show that MEU and ALLE are competitive with the leading spectral approaches on a robot navigation visualization and a human motion capture data set. Finally, the maximum likelihood perspective allows us to introduce a new approach to dimensionality reduction based on L1 regularization of the Gaussian random field via the graphical lasso.
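The final sentence refers to estimating a sparse Gaussian random field by applying an L1 penalty to the precision matrix, i.e. the graphical lasso. The sketch below is only illustrative of that general technique, not the paper's method: it uses scikit-learn's GraphicalLasso on toy data with a chain dependency, so the true precision matrix is sparse; the regularization strength `alpha` and the data-generating choices are arbitrary assumptions.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Toy data: 200 samples of a 5-dimensional Gaussian with a chain dependency,
# so the true precision (inverse covariance) matrix is sparse (tridiagonal).
X = rng.standard_normal((200, 5))
for j in range(1, 5):
    X[:, j] += 0.6 * X[:, j - 1]

# alpha sets the L1 penalty on off-diagonal precision entries; larger alpha
# yields a sparser estimated graph. The value 0.1 is an arbitrary choice.
model = GraphicalLasso(alpha=0.1)
model.fit(X)
precision = model.precision_  # sparse estimate of the GRF's precision matrix
print(np.round(precision, 2))
```

Zeros in the estimated precision matrix correspond to absent edges in the Gaussian random field, which is the sparsity structure the L1 regularization is meant to recover.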