Robust regression and outlier detection
Parallel genetic algorithms with local search
Computers and Operations Research - Special issue: artificial intelligence, evolutionary programming and operations research
Genetic Local Search Algorithms for the Travelling Salesman Problem
PPSN I Proceedings of the 1st Workshop on Parallel Problem Solving from Nature
Learning from Data: Concepts, Theory, and Methods
Clustering with a genetically optimized approach
IEEE Transactions on Evolutionary Computation
Why so many clustering algorithms: a position paper
ACM SIGKDD Explorations Newsletter
Fast Randomized Algorithms for Robust Estimation of Location
TSDM '00 Proceedings of the First International Workshop on Temporal, Spatial, and Spatio-Temporal Data Mining-Revised Papers
AUTOCLUST+: Automatic Clustering of Point-Data Sets in the Presence of Obstacles
TSDM '00 Proceedings of the First International Workshop on Temporal, Spatial, and Spatio-Temporal Data Mining-Revised Papers
An improved hybrid genetic clustering algorithm
SETN'06 Proceedings of the 4th Hellenic Conference on Advances in Artificial Intelligence
Iterative methods and genetic algorithms have been used separately to minimize the loss function of representative-based clustering formulations. Neither of them alone seems to be significantly better, and the trade-off of effort versus quality slightly favors gradient descent. We present a unifying view of the three most popular loss functions: the least sum of squares, its fuzzy version, and the log-likelihood function. We identify commonalities in the gradient descent algorithms for the three loss functions and in the evaluation of the loss function itself. This allows us to construct hybrids (genetic algorithms with a mutation operation that performs a few gradient descent steps) for all three clustering approaches. We demonstrate that these hybrids are much more efficient and effective, yielding significantly better performance when normalized by the number of function evaluations or by CPU time.
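The hybrid described in the abstract can be sketched for the least-sum-of-squares case: a small population of candidate center sets evolves under selection and crossover, and mutation refines each child with a few Lloyd (k-means) steps, i.e. the gradient-descent component. This is a minimal illustrative sketch, not the paper's implementation; the function names, the uniform crossover over centers, and all parameter values are assumptions.

```python
import numpy as np

def sse(X, centers):
    """Least-sum-of-squares loss: squared distance of each point to its nearest center."""
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d.min(axis=1).sum()

def kmeans_steps(X, centers, steps=2):
    """Mutation operator: refine centers with a few Lloyd iterations (the gradient-descent component)."""
    for _ in range(steps):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(centers)):
            pts = X[labels == k]
            if len(pts):  # leave empty clusters' centers untouched
                centers[k] = pts.mean(axis=0)
    return centers

def hybrid_ga(X, k, pop_size=10, generations=20, rng=None):
    """Hybrid GA sketch: individuals are sets of k centers; children are built by
    uniform crossover over centers and mutated with a few k-means steps."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(X)
    pop = [X[rng.choice(n, k, replace=False)].copy() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: sse(X, c))       # fitness = loss, lower is better
        survivors = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.choice(len(survivors), 2, replace=False)
            mask = rng.random(k) < 0.5          # uniform crossover: pick each center from a or b
            child = np.where(mask[:, None], survivors[a], survivors[b]).copy()
            children.append(kmeans_steps(X, child))
        pop = survivors + children
    return min(pop, key=lambda c: sse(X, c))
```

The same skeleton extends to the fuzzy and log-likelihood losses by swapping the loss evaluation and the refinement step (fuzzy c-means or EM updates) while reusing the shared distance computations the paper identifies.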