This paper investigates a methodology for effective model selection of cost-sensitive boosting algorithms. In many real-world settings, e.g. automated medical diagnosis, it is crucial to tune classification performance towards the sensitivity and specificity required by the user. To this end, for binary classification problems, we have designed a cost-sensitive variant of AdaBoost in which (1) the model error function is weighted with separate costs for errors in the two classes (false negatives and false positives), and (2) the weights are updated differently for negatives and positives at each boosting step. Finally, (3) a practical search procedure allows one to meet, or come as close as possible to, the sensitivity and specificity constraints without an extensive tabulation of the ROC curve. This off-the-shelf methodology was applied to the automatic diagnosis of melanoma on a set of 152 skin lesions described by geometric and colorimetric features, outperforming, on the same data set, both skilled dermatologists and a specialized automatic system based on a multiple classifier combination.
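The cost-sensitive scheme sketched in points (1) and (2) can be illustrated with a minimal reconstruction. This is not the authors' exact algorithm: the weak learner (an exhaustive decision stump), the cost parameters `c_fn`/`c_fp`, and the AdaCost-style asymmetric weight update are all illustrative assumptions chosen to make the idea concrete.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively fit a weighted decision stump (axis-aligned threshold)."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best, best_err

def predict_stump(stump, X):
    j, thr, sign = stump
    return np.where(X[:, j] <= thr, sign, -sign)

def cost_sensitive_adaboost(X, y, c_fn=2.0, c_fp=1.0, T=20):
    """Labels y in {-1, +1}, with +1 the positive class (e.g. melanoma).

    c_fn / c_fp are hypothetical per-class misclassification costs:
    errors on positives (false negatives) cost c_fn, errors on
    negatives (false positives) cost c_fp.
    """
    cost = np.where(y == 1, c_fn, c_fp)
    w = cost / cost.sum()              # cost-scaled initial weights
    stumps, alphas = [], []
    for _ in range(T):
        stump, err = fit_stump(X, y, w)
        err = min(max(err, 1e-12), 1.0 - 1e-12)
        if err >= 0.5:                 # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = predict_stump(stump, X)
        # Asymmetric update, a sketch of point (2): misclassified positives
        # are up-weighted by c_fn, misclassified negatives by c_fp.
        w = w * np.exp(-alpha * cost * y * pred)
        w = w / w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict_ensemble(stumps, alphas, X, threshold=0.0):
    """Weighted vote; shifting `threshold` trades sensitivity against
    specificity (a crude stand-in for the search procedure of point (3))."""
    score = sum(a * predict_stump(s, X) for s, a in zip(stumps, alphas))
    return np.where(score >= threshold, 1, -1)
```

Raising `c_fn` above `c_fp` pushes the ensemble towards higher sensitivity (fewer missed positives), while `threshold` in `predict_ensemble` offers a second, post-hoc knob for moving along the sensitivity/specificity trade-off.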