A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
A Hierarchical Latent Variable Model for Data Visualization
IEEE Transactions on Pattern Analysis and Machine Intelligence
Text Classification from Labeled and Unlabeled Documents using EM
Machine Learning - Special issue on information retrieval
Machine Learning
A Mixed Ensemble Approach for the Semi-supervised Problem
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
Semi-supervised Robust Alternating AdaBoost
CIARP '09 Proceedings of the 14th Iberoamerican Conference on Pattern Recognition: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications
Learning distance functions for image retrieval
CVPR'04 Proceedings of the 2004 IEEE computer society conference on Computer vision and pattern recognition
Improving Logitboost with prior knowledge
Information Fusion
This paper introduces MixtBoost, a variant of AdaBoost dedicated to solving problems in which both labeled and unlabeled data are available. We propose several definitions of loss for unlabeled data, from which margins are defined. The resulting boosting schemes implement mixture models as base classifiers. Preliminary experiments are analyzed and the relevance of the loss choices is discussed. MixtBoost improves on both mixture models and AdaBoost provided the classes are structured, and otherwise behaves similarly to AdaBoost.
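The core idea in the abstract, extending AdaBoost's margin to unlabeled points so they participate in the reweighting, can be illustrated with a minimal sketch. The abstract does not specify which loss MixtBoost uses, so the pseudo-margin below (the normalized confidence |F(x)| of the current ensemble) is one plausible choice, not the paper's actual definition; decision stumps stand in for the mixture-model base classifiers, and all names (`fit_stump`, `mixtboost_sketch`) are hypothetical:

```python
import math

def stump_predict(x, thr, sign):
    """Decision stump: predict +sign if x > thr, else -sign."""
    return sign if x > thr else -sign

def fit_stump(xs, ys, ws):
    """Pick the stump minimizing weighted error on the labeled points."""
    best = None
    for thr in sorted(set(xs)):
        for sign in (+1, -1):
            err = sum(w for x, y, w in zip(xs, ys, ws)
                      if y is not None and stump_predict(x, thr, sign) != y)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best

def mixtboost_sketch(xs, ys, rounds=5):
    """AdaBoost with a pseudo-margin for unlabeled points (y is None).

    Labeled points use the usual margin y * h(x); unlabeled points use
    the ensemble's normalized confidence |F(x)| / sum(alphas) as an
    assumed stand-in for one of the paper's unlabeled-loss choices.
    """
    n = len(xs)
    ws = [1.0 / n] * n
    ensemble = []  # list of (alpha, thr, sign)
    for _ in range(rounds):
        err, thr, sign = fit_stump(xs, ys, ws)
        labeled_mass = sum(w for y, w in zip(ys, ws) if y is not None)
        err = max(err / labeled_mass, 1e-10)  # clamp to avoid log(0)
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, sign))
        total_alpha = sum(a for a, _, _ in ensemble)
        new_ws = []
        for x, y, w in zip(xs, ys, ws):
            h = stump_predict(x, thr, sign)
            if y is not None:
                m = y * h  # standard AdaBoost margin
            else:
                F = sum(a * stump_predict(x, t, s) for a, t, s in ensemble)
                m = abs(F) / total_alpha  # pseudo-margin: ensemble confidence
            new_ws.append(w * math.exp(-alpha * m))
        z = sum(new_ws)
        ws = [w / z for w in new_ws]
    return lambda x: 1 if sum(a * stump_predict(x, t, s)
                              for a, t, s in ensemble) > 0 else -1

# Usage: four labeled points and two unlabeled (None) points in 1-D.
clf = mixtboost_sketch([0, 1, 2, 3, 0.5, 2.5],
                       [-1, -1, 1, 1, None, None], rounds=3)
```

With this loss, confidently classified unlabeled points are down-weighted just like correctly classified labeled ones, so later rounds focus on ambiguous regions; other loss definitions in the paper lead to different reweighting behavior.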