We propose and study a new technique for aggregating an ensemble of bootstrapped classifiers. The method seeks a linear combination of the base classifiers whose weights are optimized to minimize variance; the minimum-variance weights are computed by quadratic programming. This optimization technique is borrowed from mathematical finance, where it is known as Markowitz mean-variance portfolio optimization. We test the new method on a number of binary classification problems from the UCI repository, using a Support Vector Machine (SVM) as the base-classifier learning algorithm. Our results indicate that the proposed technique consistently outperforms Bagging and can dramatically improve SVM performance even in cases where Bagging fails to improve on the base classifier.
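To illustrate the minimum-variance weighting idea, here is a minimal sketch for the special case of two base classifiers, where the Markowitz minimum-variance solution has a closed form and no quadratic-programming solver is needed. This is an assumption-laden toy (hypothetical variances and covariance, not the paper's bootstrapped-SVM setup), but it shows the core mechanism: the combined variance of the weighted ensemble never exceeds that of the better single component.

```python
# Toy sketch: minimum-variance combination of two base classifiers'
# predictions. The paper solves the general n-classifier case with
# quadratic programming; for n = 2 the Markowitz minimum-variance
# weights have a closed form. All numbers below are hypothetical.

def min_variance_weights(var1, var2, cov12):
    """Return weights (w1, w2), summing to 1, that minimize
    Var(w1*f1 + w2*f2) for components with variances var1, var2
    and covariance cov12."""
    denom = var1 + var2 - 2.0 * cov12
    if denom == 0.0:          # degenerate case: perfectly correlated, equal variance
        return 0.5, 0.5
    w1 = (var2 - cov12) / denom
    return w1, 1.0 - w1

def combined_variance(var1, var2, cov12, w1):
    """Variance of the linear combination w1*f1 + (1-w1)*f2."""
    w2 = 1.0 - w1
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2.0 * w1 * w2 * cov12

# Hypothetical estimated variances/covariance of two classifiers' outputs.
w1, w2 = min_variance_weights(0.04, 0.09, 0.01)
# The optimized combination is no worse than the best single classifier.
assert combined_variance(0.04, 0.09, 0.01, w1) <= min(0.04, 0.09)
```

In the general case the same objective, w'Σw subject to the weights summing to one, is a quadratic program over the estimated covariance matrix Σ of the base classifiers' outputs, which is the formulation the abstract refers to.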