We have previously introduced a principle for learning metrics, which shows how metric-based methods can be made to focus on the discriminative properties of data. The main application is supervising unsupervised learning so that it models the interesting variation in data, instead of modeling all variation as plain unsupervised learning does. The metrics are derived by approximations to an information-geometric formulation. In this paper, we review the theory, introduce better approximations to the distances, and show how to apply them in two different kinds of unsupervised methods: prototype-based and pairwise distance-based. The two examples are self-organizing maps and multidimensional scaling (Sammon's mapping).
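In the learning-metrics literature, distances of this kind are commonly approximated by Kullback-Leibler divergences between the conditional distributions p(c|x) of an auxiliary variable (e.g. class labels) at the two points, which locally corresponds to the Fisher-information metric. The following is a minimal sketch of that approximation, not the paper's exact procedure; the function name and the example probability values are hypothetical:

```python
import numpy as np

def learning_metric_distance(p1, p2, eps=1e-12):
    """Approximate the learning-metric distance between two data points
    by the symmetrized KL divergence between their conditional
    distributions p(c|x), estimated e.g. with any probabilistic
    classifier. (Sketch; not the authors' exact approximation.)"""
    p1 = np.asarray(p1, dtype=float) + eps
    p2 = np.asarray(p2, dtype=float) + eps
    p1 /= p1.sum()  # renormalize after smoothing
    p2 /= p2.sum()
    kl12 = np.sum(p1 * np.log(p1 / p2))
    kl21 = np.sum(p2 * np.log(p2 / p1))
    return kl12 + kl21

# Hypothetical p(c|x) estimates at two data points:
d_same = learning_metric_distance([0.7, 0.2, 0.1], [0.7, 0.2, 0.1])
d_diff = learning_metric_distance([0.7, 0.2, 0.1], [0.1, 0.2, 0.7])
```

A pairwise distance matrix built this way can be passed directly to a distance-based method such as Sammon's mapping, while prototype-based methods such as the SOM instead use the metric in their winner search and updates.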