Evolutionary discriminant analysis
IEEE Transactions on Evolutionary Computation
Many classification algorithms rely on a notion of distance or similarity between patterns. Previous work has shown that it is advantageous to optimize general Euclidean distances (GEDs). In this paper, data transformations are optimized instead; this is equivalent to searching for GEDs, but can be applied to any learning algorithm, even one that does not use distances explicitly. Two optimization techniques have been used: a simple Local Search (LS) and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), an advanced evolutionary method for optimization in difficult continuous domains. Both diagonal and complete transformation matrices have been considered. Results show that, in general, complete matrices found by CMA-ES either outperform or match both Local Search and the classifier working on the original, untransformed data.
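The equivalence claimed above can be sketched numerically: applying a linear transformation A to the data and then taking the ordinary Euclidean distance is the same as computing a GED with matrix M = A^T A on the original data. The following is a minimal illustration (not the paper's code; the matrix A here is an arbitrary random example, not an optimized transformation):

```python
import numpy as np

# Hedged sketch: a linear transform A induces a GED with M = A^T A,
# so searching over transformations A is equivalent to searching over GEDs.
rng = np.random.default_rng(0)
x, y = rng.standard_normal(4), rng.standard_normal(4)  # two example patterns
A = rng.standard_normal((4, 4))  # a "complete" (full) transformation matrix
M = A.T @ A                      # the GED matrix induced by A

# Plain Euclidean distance in the transformed space ...
d_transformed = np.linalg.norm(A @ x - A @ y)
# ... equals the GED computed directly on the original patterns.
d_ged = np.sqrt((x - y) @ M @ (x - y))

print(np.isclose(d_transformed, d_ged))  # → True
```

A diagonal matrix A corresponds to per-feature scaling (a weighted Euclidean distance), while a complete matrix additionally rotates the feature space, which is why the complete case is strictly more expressive.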