Experimental procedures associated with Machine Learning (ML) techniques are usually computationally demanding. An important step toward a judicious allocation of ML tasks to computational resources is predicting their execution times. Previous empirical comparisons using a Meta-learning framework indicated that Support Vector Machines (SVMs) are well suited to this problem; however, their performance depends on the choice of parameter values and input features. In this paper, we address this issue by applying a Genetic Algorithm (GA) to perform joint Feature Subset Selection (FSS) and Parameter Optimization (PO). First, a GA performs FSS+PO for SVMs with two kernel functions, independently. Then, in addition to FSS+PO, an extra term is evolved to weight the predictions of the two models, yielding a combined regressor. An empirical investigation on predicting the execution times of 6 ML algorithms over 78 publicly available datasets shows higher accuracy than previously reported results.
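The joint FSS+PO scheme described in the abstract can be sketched as a GA whose chromosome concatenates a binary feature mask with the SVM hyperparameters and the combination weight. The sketch below is a minimal, self-contained illustration, not the paper's implementation: the number of meta-features, the parameter ranges, and the `fitness` function (a toy stand-in for the cross-validated accuracy of the combined regressor) are all assumptions made for the example.

```python
import random

random.seed(0)

N_FEATURES = 8  # hypothetical number of meta-features describing a dataset

def random_chromosome():
    # chromosome = binary feature mask + log2(C) + log2(gamma) + combination weight w
    mask = [random.randint(0, 1) for _ in range(N_FEATURES)]
    log_c = random.uniform(-5, 15)
    log_gamma = random.uniform(-15, 3)
    w = random.uniform(0.0, 1.0)
    return mask + [log_c, log_gamma, w]

def fitness(chrom):
    # Toy stand-in for cross-validated accuracy of the combined SVM regressor:
    # rewards a mid-sized feature subset and parameters near a fictitious optimum.
    mask = chrom[:N_FEATURES]
    log_c, log_gamma, w = chrom[-3], chrom[-2], chrom[-1]
    subset_term = -abs(sum(mask) - N_FEATURES // 2)
    param_term = -abs(log_c - 5) - abs(log_gamma + 6)
    weight_term = -abs(w - 0.5)
    return subset_term + param_term + weight_term

def crossover(a, b):
    # single-point crossover over the whole chromosome
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    chrom = chrom[:]
    for i in range(N_FEATURES):          # bit-flip mutation on the feature mask
        if random.random() < rate:
            chrom[i] = 1 - chrom[i]
    for i in range(N_FEATURES, len(chrom)):  # Gaussian mutation on the real genes
        if random.random() < rate:
            chrom[i] += random.gauss(0, 0.5)
    return chrom

def evolve(pop_size=30, generations=50):
    # elitist GA: keep the top half, refill with mutated offspring of elite parents
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

In a full system, `fitness` would train the two kernel SVMs on the masked meta-features, blend their predictions with weight `w`, and return a cross-validation score for the execution-time estimates.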