Classifier ensemble using a heuristic learning with sparsity and diversity
ICONIP'12: Proceedings of the 19th International Conference on Neural Information Processing, Part II
We consider the classifier ensemble problem in this paper. Because ensembles typically outperform individual classifiers, classifier ensembles have been studied intensively in the literature. Broadly speaking, research has followed two directions: generating diverse classifier components, and combining multiple classifiers sparsely. Whereas most existing approaches emphasize either sparsity or diversity alone, we investigate learning both simultaneously. We formulate the classifier ensemble problem with sparsity and/or diversity learning in a general framework; in particular, the ensemble with both sparsity and diversity can be cast as a mathematical optimization problem. We then propose a heuristic algorithm capable of obtaining ensemble classifiers that account for both sparsity and diversity: exploiting a genetic algorithm, we optimize sparsity and diversity for classifier selection and combination heuristically and iteratively. As one major contribution, we introduce the concept of diversity contribution ability, which is used to select suitable classifier components and, ultimately, to evolve the classifier weights. Finally, we compare the proposed method extensively with conventional classifier ensemble methods, including Bagging, least-squares combination, sparsity learning, and AdaBoost, on UCI benchmark data sets and the webspam data from the Pascal Large Scale Learning Challenge 2008. The experimental results confirm that our approach leads to better performance in many respects.
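The abstract does not spell out the "mathematical optimization problem" it refers to. One plausible form, a sketch rather than the authors' exact formulation, trades off ensemble loss against a sparsity penalty and a diversity reward; here the loss \(\ell\), the trade-off parameters \(\lambda, \gamma\), and the diversity measure \(\mathrm{div}\) are all assumptions of this sketch:

```latex
\min_{\mathbf{w} \ge 0} \;
\frac{1}{n}\sum_{i=1}^{n} \ell\!\Big(y_i,\; \sum_{k=1}^{m} w_k\, h_k(\mathbf{x}_i)\Big)
\;+\; \lambda \|\mathbf{w}\|_1
\;-\; \gamma\, \mathrm{div}(h_1,\dots,h_m;\, \mathbf{w})
```

where \(h_1,\dots,h_m\) are the component classifiers, \(\mathbf{w}\) is the combination weight vector, the \(\ell_1\) penalty drives many weights to zero (sparsity), and the \(\mathrm{div}\) term rewards disagreement among the selected components (diversity).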
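To make the abstract's recipe concrete, here is a minimal, hypothetical sketch of selecting a sub-ensemble with a genetic algorithm whose fitness combines error, a sparsity penalty, and a diversity (pairwise-disagreement) reward. All function names, the fitness weights `lam` and `gamma`, and the GA settings are illustrative assumptions, not the paper's actual algorithm, which additionally evolves real-valued weights via the diversity contribution ability.

```python
import random

def vote(mask, preds, i):
    # Unweighted majority vote of the selected components on sample i (labels in {-1, +1}).
    score = sum(preds[k][i] for k in range(len(preds)) if mask[k])
    return 1 if score >= 0 else -1

def error(mask, preds, labels):
    # Fraction of samples the selected sub-ensemble misclassifies.
    n = len(labels)
    return sum(vote(mask, preds, i) != labels[i] for i in range(n)) / n

def diversity(mask, preds):
    # Mean pairwise disagreement rate among the selected components.
    idx = [k for k, m in enumerate(mask) if m]
    if len(idx) < 2:
        return 0.0
    n = len(preds[0])
    pairs = [(a, b) for i, a in enumerate(idx) for b in idx[i + 1:]]
    return sum(sum(preds[a][j] != preds[b][j] for j in range(n)) / n
               for a, b in pairs) / len(pairs)

def fitness(mask, preds, labels, lam=0.01, gamma=0.05):
    # Lower is better: error, plus a sparsity penalty, minus a diversity reward.
    return error(mask, preds, labels) + lam * sum(mask) - gamma * diversity(mask, preds)

def evolve(preds, labels, pop_size=8, generations=30, seed=0):
    # Tiny elitist GA over binary selection masks (1 = classifier kept in the ensemble).
    rng = random.Random(seed)
    m = len(preds)
    # Seed the population with the full ensemble plus random selections.
    pop = [[1] * m] + [[rng.randint(0, 1) for _ in range(m)]
                       for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, preds, labels))
        survivors = pop[:pop_size // 2]      # elitist truncation selection
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(m)] ^= 1     # single-bit mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda c: fitness(c, preds, labels))
```

For example, with two accurate components and one biased component, `evolve` keeps a subset that classifies the sample set correctly while the sparsity term discourages carrying redundant members:

```python
labels = [1, -1, 1, -1, 1, -1]
preds = [labels[:], labels[:], [1, 1, 1, 1, 1, 1]]
best = evolve(preds, labels)
print(error(best, preds, labels))
```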