Improving bagging performance through multi-algorithm ensembles
Bagging builds a committee of classifiers, each trained on a bootstrap sample of the data, and then aggregates their predictions through majority voting. It has attracted considerable research interest and has been applied in various domains. Its advantages include an increased capability to handle small data sets, reduced sensitivity to noise and outliers, and a parallel structure that supports efficient implementation. However, it has been found to be less accurate than some other ensemble methods. In this paper, we propose an approach that improves bagging by employing multiple classification algorithms within a single ensemble. Our approach preserves the parallel structure of bagging while improving its accuracy. As a result, it unlocks the power and expands the user base of bagging.
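The idea above can be sketched in a few lines. The following is a minimal, stdlib-only illustration, not the authors' implementation: each committee member is trained on a bootstrap sample, the members cycle through several (assumed, toy) base algorithms rather than a single one, and predictions are aggregated by majority vote. The dataset, the 1-nearest-neighbour learner, and the mean-threshold learner are all hypothetical stand-ins chosen to keep the example self-contained.

```python
import random
from collections import Counter

# Toy 1-D dataset (assumed for illustration): label 1 iff the feature > 0.5.
random.seed(0)
X = [[random.random()] for _ in range(60)]
y = [1 if x[0] > 0.5 else 0 for x in X]

def one_nn(train_X, train_y):
    """Base algorithm A: 1-nearest-neighbour classifier."""
    def predict(x):
        i = min(range(len(train_X)), key=lambda j: abs(train_X[j][0] - x[0]))
        return train_y[i]
    return predict

def mean_threshold(train_X, train_y):
    """Base algorithm B: threshold at the midpoint of the two class means."""
    pos = [x[0] for x, t in zip(train_X, train_y) if t == 1]
    neg = [x[0] for x, t in zip(train_X, train_y) if t == 0]
    thr = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: 1 if x[0] > thr else 0

def multi_algorithm_bagging(X, y, learners, rounds=9):
    """Train one member per round on a bootstrap sample, cycling through the
    different learning algorithms; predict by majority vote of the committee."""
    committee = []
    for r in range(rounds):
        idx = [random.randrange(len(X)) for _ in range(len(X))]   # bootstrap sample
        bx, by = [X[i] for i in idx], [y[i] for i in idx]
        committee.append(learners[r % len(learners)](bx, by))     # rotate algorithms
    def predict(x):
        votes = Counter(member(x) for member in committee)
        return votes.most_common(1)[0][0]                          # majority vote
    return predict

model = multi_algorithm_bagging(X, y, [one_nn, mean_threshold])
print(model([0.9]), model([0.1]))
```

Because each member is trained independently on its own bootstrap sample, the training loop is trivially parallelisable, which is the structural property of bagging the paper's approach preserves.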