Features and Metric from a Classifier Improve Visualizations with Dimension Reduction
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part II
Improved methods are presented for learning metrics that measure only important distances. It is assumed that changes in the primary data are relevant only to the extent that they cause changes in auxiliary data, which is available paired with the primary data. The metrics are derived from estimators of the conditional density of the auxiliary data. Estimators of increasing accuracy are compared, and a more accurate approximation to the distances is introduced. The new methods significantly improved the quality of Self-Organizing Maps (SOMs) on four of the five data sets studied.
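The idea described in the abstract can be sketched as follows: local distances in the primary data space are measured by the Fisher information of an estimator of the conditional density p(c|x) of the auxiliary data c, and point-to-point distances are approximated by integrating that local metric along the connecting line. The concrete choices below (a Parzen-style class estimator, finite-difference gradients, a T-point line-integral approximation, and all parameter values) are illustrative assumptions, not the paper's exact estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy primary data X with binary auxiliary labels C; the label depends
# (noisily) only on the first coordinate, so only that direction should
# be "important" under the learned metric.
X = rng.normal(size=(200, 2))
C = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)

def cond_density(x, h=0.5):
    """Parzen-style estimate of p(c|x) for the two classes (assumed estimator)."""
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * h ** 2))
    p1 = w[C == 1].sum() / w.sum()
    return np.array([1.0 - p1, p1])

def fisher_metric(x, eps=1e-4):
    """Local Fisher information J(x) = sum_c p(c|x) g_c g_c^T,
    with g_c = grad_x log p(c|x) computed by central finite differences."""
    p = cond_density(x)
    grads = np.zeros((2, 2))  # rows: classes, cols: data dimensions
    for d in range(2):
        dx = np.zeros(2)
        dx[d] = eps
        grads[:, d] = (np.log(cond_density(x + dx))
                       - np.log(cond_density(x - dx))) / (2 * eps)
    return sum(p[c] * np.outer(grads[c], grads[c]) for c in range(2))

def metric_dist2(a, b, T=10):
    """T-point approximation to the squared metric distance: average the
    quadratic form at T points along the line from a to b, rather than
    evaluating the metric only at one endpoint."""
    ts = (np.arange(T) + 0.5) / T
    diff = b - a
    return float(np.mean([diff @ fisher_metric(a + t * diff) @ diff for t in ts]))
```

On this toy data, a unit step along the label-relevant first coordinate yields a much larger metric distance than a unit step along the irrelevant second coordinate; such distances could then replace Euclidean ones when training a SOM.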