Artificial Intelligence Review - Special issue on lazy learning
On Clustering Validation Techniques
Journal of Intelligent Information Systems
Lamarckian Evolution, The Baldwin Effect and Function Optimization
PPSN III: Proceedings of the Third International Conference on Parallel Problem Solving from Nature
A Cooperative Coevolutionary Approach to Function Optimization
PPSN III: Proceedings of the Third International Conference on Parallel Problem Solving from Nature
Coevolutionary Life-Time Learning
PPSN IV: Proceedings of the Fourth International Conference on Parallel Problem Solving from Nature
ICCBR '95 Proceedings of the First International Conference on Case-Based Reasoning Research and Development
Examining Locally Varying Weights for Nearest Neighbor Algorithms
ICCBR '97 Proceedings of the Second International Conference on Case-Based Reasoning Research and Development
Weighting Unusual Feature Types
Comparison between two coevolutionary feature weighting algorithms in clustering
Pattern Recognition
Feature weighting is an increasingly important step in clustering as data become more complex. An embedded local feature weighting method was proposed in [1]. In this paper, we present a new method based on the same cost function but optimized with a genetic algorithm. The learning process can follow either an evolutionary or a cooperative coevolutionary approach. Moreover, the genetic algorithm can be combined with the original Weighting K-means algorithm in a Lamarckian learning paradigm. We compare hill-climbing optimization versus genetic algorithms, evolutionary versus coevolutionary approaches, and Darwinian versus Lamarckian learning on several datasets. The results suggest that, on the datasets where the original algorithm is efficient, the proposed methods perform even better.
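The abstract contrasts Darwinian evolution of feature weights with a Lamarckian variant in which a local refinement (one pass of a Weighting K-means-style update) is written back into the genotype. The following is a minimal, self-contained sketch of that idea on a toy two-feature dataset; the dataset, the fixed cluster centres, the hill-climbing `local_refine` step, and all parameter values are illustrative assumptions, not the paper's actual algorithm or cost function.

```python
import random

# Toy 2-D dataset: feature 0 separates two clusters, feature 1 is pure noise.
random.seed(0)
data = [(random.gauss(0.0, 0.3), random.gauss(0.0, 2.0)) for _ in range(20)] + \
       [(random.gauss(5.0, 0.3), random.gauss(0.0, 2.0)) for _ in range(20)]
centers = [(0.0, 0.0), (5.0, 0.0)]  # fixed centres, a stand-in for K-means output

def cost(weights):
    """Weighted within-cluster dispersion: lower is better."""
    return sum(min(sum(w * (xi - ci) ** 2
                       for w, xi, ci in zip(weights, x, c))
                   for c in centers)
               for x in data)

def normalize(w):
    s = sum(w)
    return [wi / s for wi in w]

def local_refine(w, step=0.05):
    """One greedy hill-climbing pass, standing in for a Weighting K-means update."""
    best = w
    for i in range(len(w)):
        for d in (-step, step):
            cand = normalize([max(1e-6, wi + (d if j == i else 0.0))
                              for j, wi in enumerate(w)])
            if cost(cand) < cost(best):
                best = cand
    return best

def evolve(lamarckian, generations=30, pop_size=10):
    """GA over weight vectors; if lamarckian, refined weights replace the genotype."""
    pop = [normalize([random.random() + 0.1 for _ in range(2)])
           for _ in range(pop_size)]
    for _ in range(generations):
        if lamarckian:
            # Lamarckian learning: acquired improvements are inherited.
            pop = [local_refine(w) for w in pop]
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            child = [(ai + bi) / 2 + random.gauss(0.0, 0.02)
                     for ai, bi in zip(a, b)]
            children.append(normalize([max(1e-6, c) for c in child]))
        pop = parents + children
    return min(pop, key=cost)

best = evolve(lamarckian=True)
```

Because feature 0 has far lower within-cluster variance than the noise feature, the evolved weight vector should concentrate most of its mass on feature 0; the Darwinian variant (`lamarckian=False`) runs the same loop but lets selection alone drive the weights.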