An analysis of a spatial EA parallel boosting algorithm
GECCO '13 Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation
The scalability of machine learning (ML) algorithms has become increasingly important due to the ever-growing size of datasets and the complexity of the models induced. Standard approaches to this problem involve developing parallel and distributed versions of ML algorithms and/or reducing dataset sizes via sampling techniques. In this paper we describe an alternative approach that combines features of spatially structured evolutionary algorithms (SSEAs) with the well-known machine learning techniques of ensemble learning and boosting. The result is a powerful and robust framework for parallelizing ML methods that requires no changes to the ML methods themselves. We first describe the framework and illustrate its behavior on a simple synthetic problem, and then evaluate its scalability and robustness using several different ML methods on a set of benchmark problems from the UC Irvine Machine Learning Repository.
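The abstract gives no implementation details, so the following is only a minimal sketch of how such a framework might look, not the authors' algorithm. It assumes binary 0/1 labels, a von Neumann neighborhood on a small torus, scikit-learn decision stumps as the unmodified off-the-shelf learner, and an AdaBoost-style doubling of the weights of misclassified examples; the grid size, sampling fraction, and all names (GRID, fit_grid, neighbors, etc.) are hypothetical.

```python
# Sketch: a toroidal grid of cells, each training an unmodified weak learner
# on a weighted local subsample and re-weighting the examples that its
# spatial neighborhood's vote misclassifies. Illustrative assumptions only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
GRID = 4  # hypothetical 4x4 torus of cells, each an independent boosting node

def neighbors(i, j):
    """Von Neumann neighborhood (self + N/S/E/W) on a torus."""
    return [(i, j), ((i - 1) % GRID, j), ((i + 1) % GRID, j),
            (i, (j - 1) % GRID), (i, (j + 1) % GRID)]

def fit_grid(X, y, rounds=10, sample_frac=0.25):
    n = len(X)
    # Each cell keeps its own AdaBoost-style distribution over the examples.
    w = {(i, j): np.full(n, 1.0 / n) for i in range(GRID) for j in range(GRID)}
    models = {}
    for _ in range(rounds):
        for (i, j), wij in w.items():
            # Train the unmodified base learner on a weighted local subsample;
            # no change to the ML method itself, only to the data it sees.
            idx = rng.choice(n, size=int(sample_frac * n), p=wij)
            models[(i, j)] = DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])
        for (i, j) in w:
            # Up-weight examples the neighborhood's majority vote gets wrong,
            # so local spatial communication replaces a global boosting step.
            votes = np.mean([models[c].predict(X) for c in neighbors(i, j)], axis=0)
            miss = np.sign(votes - 0.5) != np.sign(y - 0.5)
            w[(i, j)][miss] *= 2.0
            w[(i, j)] /= w[(i, j)].sum()
    return list(models.values())

def predict(models, X):
    # Final hypothesis: plain majority vote over every cell's learner.
    return (np.mean([m.predict(X) for m in models], axis=0) > 0.5).astype(int)
```

Because each cell trains only on its local subsample and consults only its immediate neighbors, the loop over cells parallelizes naturally, which is the kind of scalability property the paper evaluates.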