A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing (STOC'94), May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Boosting and Rocchio applied to text filtering
Proceedings of the 21st annual international ACM SIGIR conference on Research and development in information retrieval
Training methods for adaptive boosting of neural networks
NIPS '97 Proceedings of the 1997 conference on Advances in neural information processing systems 10
Using Decision Trees to Construct a Practical Parser
Machine Learning - Special issue on natural language learning
BoosTexter: A Boosting-based System for Text Categorization
Machine Learning - Special issue on information retrieval
A Unified Bias-Variance Decomposition and its Applications
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Ensemble Methods in Machine Learning
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Tuning Cost-Sensitive Boosting and Its Application to Melanoma Diagnosis
MCS '01 Proceedings of the Second International Workshop on Multiple Classifier Systems
Boosting of Tree-Based Classifiers for Predictive Risk Modeling in GIS
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Boosting, Bagging, and Consensus Based Classification of Multisource Remote Sensing Data
MCS '01 Proceedings of the Second International Workshop on Multiple Classifier Systems
A brief introduction to boosting
IJCAI'99 Proceedings of the 16th international joint conference on Artificial intelligence - Volume 2
Boosting methods are known to improve the generalization performance of learning algorithms, both by reducing bias and variance and by enlarging the margin of the resulting multi-classifier system. In this contribution we applied AdaBoost to the discrimination of different types of coffee using data produced with an Electronic Nose. Two groups of coffees (blends and monovarieties), consisting of seven classes each, were analyzed. The boosted ensemble of Multi-Layer Perceptrons was able to halve the classification error on the blends data and to reduce it from 21% to 18% on the more difficult monovarieties data set.
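The boosting scheme the abstract refers to can be illustrated with a minimal AdaBoost.M1 sketch. This is a generic binary-classification illustration using decision stumps as weak learners (the paper itself boosts Multi-Layer Perceptrons on Electronic Nose data, and its seven-class setting would need a multiclass variant); all names below are illustrative, not from the paper.

```python
import numpy as np

def adaboost_m1(X, y, n_rounds=20):
    """Minimal AdaBoost.M1 with decision stumps.
    X: (n, d) features; y: labels in {-1, +1}.
    Returns the list of stumps and their vote weights."""
    n = len(y)
    w = np.full(n, 1.0 / n)                # uniform initial sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        best = None
        # exhaustive search for the stump minimizing the weighted error
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-10)              # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # weak learner's vote weight
        pred = sign * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)     # up-weight misclassified samples
        w /= w.sum()
        stumps.append((j, thr, sign))
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted majority vote of the boosted stumps."""
    score = sum(a * s * np.where(X[:, j] >= t, 1, -1)
                for (j, t, s), a in zip(stumps, alphas))
    return np.sign(score)
```

On data a single stump cannot separate (e.g. a positive interval inside a negative range), the reweighting step forces later stumps to focus on the remaining errors, which is the mechanism behind the error reductions reported in the abstract.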