In this paper we focus on the adaptation of boosting to grammatical inference. We aim at improving the performance of state merging algorithms in the presence of noisy data by using, in the update rule, additional information provided by an oracle. This strategy requires the construction of a new weighting scheme that takes into account the confidence in the labels of the examples. We prove that our new framework preserves the theoretical properties of boosting. Using the state merging algorithm RPNI*, we describe an experimental study on various datasets, showing a dramatic improvement in performance.
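The core idea in the abstract — an AdaBoost-style update rule in which each example's contribution is damped by an oracle's confidence in its label — can be sketched as follows. This is a minimal illustration of the general confidence-weighted scheme, not the paper's exact update rule; the function name, the `[0, 1]` confidence encoding, and the toy hypothesis are all assumptions made for the example.

```python
import math

def boost_weights(examples, hypothesis, weights, confidences):
    """One round of an AdaBoost-style weight update in which each
    example's contribution is scaled by an oracle-supplied label
    confidence in [0, 1]. A sketch of the idea, not the paper's rule."""
    # Weighted training error, with each example's vote scaled by
    # how much we trust its label.
    err = sum(w * c for (x, y), w, c in zip(examples, weights, confidences)
              if hypothesis(x) != y)
    err = min(max(err, 1e-10), 1 - 1e-10)   # guard against 0 or 1 error
    alpha = 0.5 * math.log((1 - err) / err)  # classic AdaBoost step size

    new_weights = []
    for (x, y), w, c in zip(examples, weights, confidences):
        margin = 1 if hypothesis(x) == y else -1
        # A low-confidence label (c near 0) barely moves its weight;
        # a trusted label (c near 1) gets the full exponential update.
        new_weights.append(w * math.exp(-alpha * margin * c))

    z = sum(new_weights)                     # normalize to a distribution
    return [w / z for w in new_weights], alpha
```

With fully trusted labels (`c = 1` everywhere) this reduces to the standard AdaBoost reweighting: misclassified examples gain weight, correctly classified ones lose it.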