Decreasing individual error and increasing diversity among classifiers are two crucial factors in improving ensemble performance. However, the "kappa-error" diagram shows that greater diversity comes at the expense of individual accuracy. This paper therefore proposes a new method, Matching Pursuit Optimization Ensemble Classifiers (MPOEC), to balance diversity against individual accuracy. MPOEC applies the greedy iterative algorithm of matching pursuit to search for an optimal combination of all the classifiers, eliminating similar or poorly performing classifiers by assigning them zero coefficients. In the MPOEC approach, each classifier's coefficient is obtained by minimizing the residual between the target function and the linear combination of the basis functions. In particular, when basis functions are similar, their coefficients approach zero within an iteration of the optimization, so the coefficients obtained for the classifiers reflect the diversity among ensemble members. Because some classifiers receive zero coefficients, MPOEC may also be regarded as a selective classifier ensemble method. Experimental results show that MPOEC improves performance over competing methods, and kappa-error diagrams indicate that the proposed method increases diversity relative to standard ensemble strategies and an evolutionary ensemble.
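The coefficient-assignment step described above is, at its core, classical matching pursuit with classifier outputs serving as the dictionary atoms. The following is a minimal sketch of that idea only, not the authors' implementation: it assumes each column of P holds one classifier's predictions (e.g. margins or +/-1 labels) on a validation set, y is the target vector, and the function name and parameters (matching_pursuit_weights, n_iter, tol) are hypothetical.

import numpy as np

def matching_pursuit_weights(P, y, n_iter=10, tol=1e-6):
    """Greedy matching pursuit over classifier outputs.

    P : (n_samples, n_classifiers) array; column j is classifier j's
        validation-set predictions, treated as a basis function (atom).
    y : (n_samples,) target vector.
    Returns one coefficient per classifier; classifiers never selected
    keep a zero coefficient and are effectively pruned from the ensemble.
    """
    norms = np.linalg.norm(P, axis=0)
    atoms = P / norms                            # unit-norm basis functions
    residual = y.astype(float)
    coef = np.zeros(P.shape[1])
    for _ in range(n_iter):
        corr = atoms.T @ residual                # correlation with residual
        j = int(np.argmax(np.abs(corr)))         # best-matching atom
        if abs(corr[j]) < tol:                   # residual fully explained
            break
        coef[j] += corr[j]                       # accumulate its coefficient
        residual = residual - corr[j] * atoms[:, j]  # deflate the residual
    return coef / norms                          # rescale to original columns

Because the residual is deflated along each selected atom, a classifier whose outputs are nearly collinear with an already-selected one has little left to explain and tends to keep a coefficient near zero; this is one plausible reading of the pruning-by-diversity behavior the abstract describes.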