An ensemble is a collective decision-making system that applies a combination strategy to the predictions of its learned component classifiers in order to classify new instances. Early research showed, both empirically and theoretically, that an ensemble is in most cases more accurate than any of its single component classifiers. Although many ensemble approaches have been proposed, finding a suitable ensemble configuration for a specific dataset remains a difficult task. In early work, the ensemble was selected manually according to the experience of specialists. Metaheuristic methods offer an alternative way to search for such configurations, and Ant Colony Optimization (ACO) is one of the most popular among them. In this work, we propose a new ensemble construction method that applies ACO to the stacking ensemble construction process to generate domain-specific configurations. A series of experiments compares the proposed approach with several well-known ensemble methods on 18 benchmark data mining datasets, and the approach is also applied to learning ensembles for a real-world cost-sensitive data mining problem. The experimental results show that the new approach generates better stacking ensembles.
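To illustrate the general idea of using ACO to search for a stacking configuration, the following is a minimal, self-contained sketch. It is not the paper's algorithm: the candidate learner names are placeholders, and the `score` function is a stand-in for the cross-validated accuracy of a stacking ensemble built from the chosen base learners. Ants probabilistically include each candidate according to its pheromone level; pheromone then evaporates and is reinforced along the best configuration found.

```python
import random

# Hypothetical candidate base learners for the stacking ensemble.
CANDIDATES = ["naive_bayes", "knn", "c4.5", "svm", "vfi"]

def score(subset):
    # Stand-in for cross-validated accuracy of a stacking ensemble
    # built from `subset`; real use would train and evaluate on data.
    base = {"naive_bayes": 0.80, "knn": 0.78, "c4.5": 0.82,
            "svm": 0.84, "vfi": 0.70}
    if not subset:
        return 0.0
    diversity_bonus = 0.01 * (len(subset) - 1)  # toy reward for variety
    return sum(base[c] for c in subset) / len(subset) + diversity_bonus

def aco_search(n_ants=10, n_iters=20, rho=0.1, seed=0):
    rng = random.Random(seed)
    pheromone = {c: 1.0 for c in CANDIDATES}
    best, best_score = None, -1.0
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Each ant includes a learner with probability proportional
            # to its current pheromone level.
            total = sum(pheromone.values())
            subset = tuple(
                c for c in CANDIDATES
                if rng.random() < 0.5 * pheromone[c] * len(CANDIDATES) / total
            )
            s = score(subset)
            if s > best_score:
                best, best_score = subset, s
        # Evaporate pheromone, then reinforce the best configuration.
        for c in CANDIDATES:
            pheromone[c] *= (1.0 - rho)
        for c in best:
            pheromone[c] += best_score
    return best, best_score

best_subset, best_acc = aco_search()
```

Because the scoring surrogate is deterministic and the random generator is seeded, repeated runs return the same configuration; in the real setting, each evaluation would instead involve building and cross-validating a stacking ensemble, which is where most of the computational cost lies.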