We propose a novel ensemble learning algorithm called Triskel, which has two interesting features. First, Triskel learns an ensemble of classifiers that are biased to have high precision (as opposed to, for example, boosting, where the ensemble members are biased to ignore portions of the instance space). Second, Triskel uses weighted voting like most ensemble methods, but the weights are assigned so that certain pairs of biased classifiers outweigh the rest of the ensemble, if their predictions agree. Our experiments on a variety of real-world tasks demonstrate that Triskel often outperforms boosting, in terms of both accuracy and training time.
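The voting rule described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the function name `triskel_vote`, the pair/fallback split, and the tie-breaking toward the positive class are all assumptions made for the example. It shows the key idea that a pair of oppositely biased classifiers overrides the rest of the ensemble whenever they agree.

```python
# Hypothetical sketch of Triskel-style voting (not the authors' code).
# Classifiers are callables mapping an instance to a label in {+1, -1}.

def triskel_vote(biased_pair, fallback, x):
    """Combine one pair of high-precision biased classifiers with a fallback.

    biased_pair: two classifiers, one biased toward each class; their joint
    weight is assumed to outweigh the fallback ensemble when they agree.
    fallback: ordinary classifiers combined by unweighted majority vote.
    """
    a, b = biased_pair[0](x), biased_pair[1](x)
    # If the biased pair agrees on a label, its weight dominates the rest.
    if a == b:
        return a
    # Otherwise fall back to a majority vote over the remaining ensemble
    # (ties broken toward +1 here, an arbitrary choice for the sketch).
    votes = sum(clf(x) for clf in fallback)
    return 1 if votes >= 0 else -1
```

For instance, with a positively biased classifier that only predicts +1 when very confident and a negatively biased one that only predicts -1 when very confident, instances on which both agree bypass the fallback vote entirely.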