Evolution strategies – A comprehensive introduction
Natural Computing: an international journal
Regularized principal manifolds
The Journal of Machine Learning Research
Principal Surfaces from Unsupervised Kernel Regression
IEEE Transactions on Pattern Analysis and Machine Intelligence
Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models
The Journal of Machine Learning Research
A derandomized approach to self-adaptation of evolution strategies
Evolutionary Computation
Dimensionality Reduction by Unsupervised K-Nearest Neighbor Regression
ICMLA '11 Proceedings of the 2011 10th International Conference on Machine Learning and Applications and Workshops - Volume 01
Unsupervised nearest neighbors with kernels
KI'12 Proceedings of the 35th Annual German conference on Advances in Artificial Intelligence
The detection of structure in high-dimensional data plays an important role in machine learning. Recently, we proposed a fast iterative strategy for non-linear dimensionality reduction based on the unsupervised formulation of K-nearest neighbor regression. As the unsupervised nearest neighbor (UNN) optimization problem does not allow the computation of derivatives, the use of direct search methods is reasonable. In this paper, we introduce evolutionary optimization approaches for learning UNN embeddings. Two continuous variants are based on the CMA-ES, employing regularization via domain restriction and via penalizing extension in latent space. A combinatorial variant is based on embedding the latent variables on a grid and performing stochastic swaps. We compare the results on artificial dimensionality reduction problems.
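The combinatorial variant described in the abstract can be illustrated with a small sketch: latent variables are placed on a 1D grid, each point is reconstructed as the mean of its K nearest neighbors in latent space, and random transpositions of grid positions are kept only if they reduce the data-space reconstruction error. The function names (`unn_error`, `unn_swap_search`) and all parameter choices below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def unn_error(X, order, K=2):
    """Data-space reconstruction error of a latent grid assignment.

    `order[k]` is the index of the point placed at grid slot k.
    Each point is predicted as the mean of the K points whose grid
    positions are nearest to its own (excluding itself); the error
    is the total squared deviation from these predictions.
    """
    N = X.shape[0]
    pos = np.empty(N, dtype=int)
    pos[order] = np.arange(N)          # grid position of each point
    err = 0.0
    for i in range(N):
        d = np.abs(pos - pos[i])       # latent (grid) distances
        d[i] = N + 1                   # exclude the point itself
        nbrs = np.argsort(d)[:K]       # K nearest latent neighbors
        err += np.sum((X[i] - X[nbrs].mean(axis=0)) ** 2)
    return err

def unn_swap_search(X, K=2, iters=2000, seed=0):
    """Stochastic swap search: try random transpositions of the
    latent grid assignment and keep only improving swaps."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    order = rng.permutation(N)         # random initial embedding
    best = unn_error(X, order, K)
    for _ in range(iters):
        i, j = rng.integers(0, N, size=2)
        order[i], order[j] = order[j], order[i]
        e = unn_error(X, order, K)
        if e < best:
            best = e                   # keep the improving swap
        else:
            order[i], order[j] = order[j], order[i]  # undo
    return order, best
```

On a simple test set, e.g. noisy points sampled along a curve, this hill-climbing search recovers an ordering whose reconstruction error is no worse than the random initial assignment; the continuous CMA-ES variants differ in that the latent positions themselves are real-valued and regularized rather than restricted to grid slots.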