An application of the principle of maximum information preservation to linear systems. Advances in Neural Information Processing Systems 1.
Elements of Information Theory.
GTM: the generative topographic mapping. Neural Computation.
Mutual information, Fisher information, and population coding. Neural Computation.
A view of the EM algorithm that justifies incremental, sparse, and other variants. Learning in Graphical Models.
Mixtures of probabilistic principal component analyzers. Neural Computation.
Deriving receptive fields using an optimal encoding criterion. Advances in Neural Information Processing Systems 5 (NIPS).
Adaptive mixtures of local experts. Neural Computation.
Mutual information (MI) is a long-studied measure of information content, and many attempts have been made to apply it to feature extraction and stochastic coding. In general, however, MI is computationally intractable to evaluate, and most previous studies replace the criterion with a tractable approximation. Recently we described properties of a simple lower bound on MI and discussed its links to some popular dimensionality reduction techniques [1]. Here we introduce a richer family of auxiliary variational bounds on MI, which generalizes our previous approximations. Our specific focus is on applying the bound to extract informative lower-dimensional projections in the presence of irreducible Gaussian noise. We show that our method produces significantly tighter bounds than the well-known as-if-Gaussian approximations of MI, and that the auxiliary-variable method can significantly improve reconstructions from noisy lower-dimensional projections.
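For context, a minimal sketch of the simple bound referenced in [1], assuming it is the standard variational lower bound on MI obtained from the non-negativity of the KL divergence between the exact posterior p(x|y) and an arbitrary variational decoder q(x|y):

\[
I(x;y) \;=\; H(x) - H(x|y) \;\ge\; H(x) + \big\langle \ln q(x|y) \big\rangle_{p(x,y)},
\]

with equality when q(x|y) = p(x|y), so jointly maximizing the right-hand side over the encoder and the decoder q tightens the bound. On this reading, the richer auxiliary bounds above would enlarge the family of tractable decoders q(x|y) via an auxiliary variable; the exact construction is given in the paper itself.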