We describe a family of embedding algorithms based on nonparametric estimates of mutual information (MI). Using Parzen window estimates of the distribution in the joint (input, embedding) space, we derive an MI-based objective function for dimensionality reduction that can be optimized directly with respect to a set of latent data representatives. Various types of supervision signals can be introduced within the framework by replacing plain MI with one of several forms of conditional MI. Examples of the semi-(un)supervised algorithms obtained this way include a new model for manifold alignment and a new type of embedding method that performs 'conditional dimensionality reduction'.
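The core idea can be illustrated with a minimal sketch: estimate I(X; Z) = H(X) + H(Z) - H(X, Z) nonparametrically, with each entropy computed from a leave-one-out Parzen (Gaussian kernel) resubstitution estimate, so that the objective is a differentiable function of the latent coordinates Z. This is not the paper's implementation; the fixed bandwidths `hx`, `hz`, the leave-one-out estimator, and all function names are assumptions made for illustration. Note that because a Gaussian kernel in the joint (input, embedding) space factorizes into the product of the input and embedding kernels, the Gaussian normalization constants cancel exactly in the MI difference and can be dropped.

```python
import numpy as np

def sq_dists(A):
    # Pairwise squared Euclidean distances between rows of A.
    s = (A * A).sum(axis=1)
    return np.maximum(s[:, None] + s[None, :] - 2.0 * A @ A.T, 0.0)

def loo_entropy(K):
    # Leave-one-out Parzen resubstitution entropy estimate from a
    # kernel matrix K (up to additive constants that cancel in the MI).
    K = K.copy()
    np.fill_diagonal(K, 0.0)          # leave-one-out: exclude self-kernel
    n = K.shape[0]
    return -np.mean(np.log(K.sum(axis=1) / (n - 1) + 1e-300))

def mi_estimate(X, Z, hx=1.0, hz=1.0):
    # Nonparametric estimate of I(X; Z) = H(X) + H(Z) - H(X, Z).
    Kx = np.exp(-sq_dists(X) / (2.0 * hx ** 2))
    Kz = np.exp(-sq_dists(Z) / (2.0 * hz ** 2))
    # The joint-space Gaussian kernel factorizes: K_joint = Kx * Kz.
    return loo_entropy(Kx) + loo_entropy(Kz) - loo_entropy(Kx * Kz)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 2))
    Z_dep = X[:, :1]                   # embedding dependent on the input
    Z_ind = rng.standard_normal((200, 1))  # independent "embedding"
    # The dependent pair should yield the larger MI estimate.
    print(mi_estimate(X, Z_dep), mi_estimate(X, Z_ind))
```

In the full framework, `Z` would be a free set of latent representatives updated by gradient ascent on this objective (e.g. via `scipy.optimize.minimize` on its negation); the conditional-MI variants mentioned above would replace `mi_estimate` with an objective that additionally conditions on a supervision signal.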