Communications of the ACM
What size net gives valid generalization? Neural Computation.
The Strength of Weak Learnability. Machine Learning.
A training algorithm for optimal margin classifiers. In COLT '92: Proceedings of the fifth annual workshop on Computational learning theory.
C4.5: programs for machine learning.
Cryptographic limitations on learning Boolean formulae and finite automata. Journal of the ACM (JACM).
The nature of statistical learning theory.
Machine Learning
Boosting a weak learning algorithm by majority. Information and Computation.
Game theory, on-line prediction and boosting. In COLT '96: Proceedings of the ninth annual conference on Computational learning theory.
A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, special issue: 26th annual ACM Symposium on Theory of Computing (STOC '94), May 23–25, 1994, and second annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995.
Improved boosting algorithms using confidence-rated predictions. In COLT '98: Proceedings of the eleventh annual conference on Computational learning theory.
Boosting in the limit: maximizing the margin of learned ensembles. In AAAI '98/IAAI '98: Proceedings of the fifteenth national/tenth conference on Artificial intelligence/Innovative applications of artificial intelligence.
Training methods for adaptive boosting of neural networks. In NIPS '97: Proceedings of the 1997 conference on Advances in neural information processing systems 10.
Boosting the margin: A new explanation for the effectiveness of voting methods. In ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning.
Using output codes to boost multiclass learning problems. In ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning.
Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research.
An empirical evaluation of bagging and boosting. In AAAI '97/IAAI '97: Proceedings of the fourteenth national conference on Artificial intelligence and ninth conference on Innovative applications of artificial intelligence.
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
IEEE Transactions on Information Theory
A Representation for Accuracy-Based Assessment of Classifier System Prediction Performance. In IWLCS '01: Revised Papers from the 4th International Workshop on Advances in Learning Classifier Systems.
Weighted Majority Decision among Several Region Rules for Scientific Discovery. In DS '99: Proceedings of the Second International Conference on Discovery Science.
Using Diversity with Three Variants of Boosting: Aggressive, Conservative, and Inverse. In MCS '02: Proceedings of the Third International Workshop on Multiple Classifier Systems.
Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification. In MCS '00: Proceedings of the First International Workshop on Multiple Classifier Systems.
Smooth Boosting and Learning with Malicious Noise. In COLT '01/EuroCOLT '01: Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory.
An Unsupervised Collaborative Learning Method to Refine Classification Hierarchies. In ICTAI '99: Proceedings of the 11th IEEE International Conference on Tools with Artificial Intelligence.
Neural Network Ensembles from Training Set Expansions. In CIARP '09: Proceedings of the 14th Iberoamerican Conference on Pattern Recognition: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications.
International Journal of Approximate Reasoning
Combining techniques for software quality classification: An integrated decision network approach. Expert Systems with Applications: An International Journal.
A novel training weighted ensemble (TWE) with application to face recognition. Applied Soft Computing.
Quadratic error minimization in a distributed environment with privacy preserving. In PSDML '10: Proceedings of the international ECML/PKDD conference on Privacy and security issues in data mining and machine learning.
Hybrid artificial neural networks: models, algorithms and data. In IWANN '11: Proceedings of the 11th international conference on Artificial neural networks: Advances in computational intelligence, Part II.
An ensemble of degraded neural networks. In MCPR '11: Proceedings of the Third Mexican conference on Pattern recognition.
Strengthening learning algorithms by feature discovery. Information Sciences: an International Journal.
Training regression ensembles by sequential target correction and resampling. Information Sciences: an International Journal.
Privacy Preserving Aggregation of Secret Classifiers. Transactions on Data Privacy.
Boosting is a general method for improving the accuracy of any given learning algorithm. Focusing primarily on the AdaBoost algorithm, we briefly survey theoretical work on boosting, including analyses of AdaBoost's training error and generalization error, connections between boosting and game theory, methods of estimating probabilities using boosting, and extensions of AdaBoost for multiclass classification problems. We also mention some empirical work.
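To make the AdaBoost algorithm surveyed here concrete, below is a minimal sketch in Python with decision stumps as the weak learner. The function and variable names are our own illustration, not taken from the paper; the sketch follows the standard binary AdaBoost recipe: maintain a distribution over training examples, pick the weak hypothesis with smallest weighted error, weight it by alpha = (1/2) ln((1 - err)/err), reweight the examples toward the mistakes, and predict with the sign of the weighted vote.

```python
import numpy as np

def train_stump(X, y, w):
    """Pick the decision stump (feature, threshold, polarity) that
    minimizes the weighted classification error under distribution w."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for thresh in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best, best_err = (j, thresh, polarity), err
    return best, best_err

def stump_predict(stump, X):
    j, thresh, polarity = stump
    return np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)

def adaboost(X, y, T=20):
    """Binary AdaBoost with stumps; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # uniform initial distribution D_1
    ensemble = []
    for _ in range(T):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)                  # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak hypothesis
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
        w /= w.sum()                           # renormalize to get D_{t+1}
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted majority vote over the weak hypotheses."""
    score = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
    return np.sign(score)
```

On a toy 1-D set such as y = [+1, +1, -1, -1, +1, +1], which no single stump fits, three boosting rounds already combine stumps into a vote that classifies every point correctly, illustrating how boosting strengthens a weak learner.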