Boosting is a general method for improving the performance of learning algorithms. A recently proposed boosting algorithm, AdaBoost, has been applied with great success to several benchmark machine learning problems, mainly using decision trees as base classifiers. In this article we investigate whether AdaBoost also works as well with neural networks, and we discuss the advantages and drawbacks of different versions of the AdaBoost algorithm. In particular, we compare training methods based on sampling the training set and on weighting the cost function. The results suggest that random resampling of the training data is not the main explanation for the improvements brought by AdaBoost. This is in contrast to bagging, which aims directly at reducing variance and for which random resampling is essential to obtain the reduction in generalization error. Our system achieves about 1.4% error on a data set of on-line handwritten digits from more than 200 writers. A boosted multilayer network achieved 1.5% error on the UCI Letter data set and 8.1% error on the UCI satellite data set, significantly better than boosted decision trees.
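To make the two training variants compared above concrete, the following is a minimal sketch (not the authors' implementation) of discrete AdaBoost with a small neural network as the base learner. The "weighting" variant passes the boosting distribution into the cost function as per-example weights, while the "sampling" variant draws a resampled training set according to that distribution. The TinyNet class, its hyperparameters (hidden size, learning rate, number of epochs), and the adaboost helper are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


class TinyNet:
    """One-hidden-layer network for labels in {-1, +1}, trained by full-batch
    gradient descent on a logistic loss that accepts per-example weights."""

    def __init__(self, n_in, n_hidden=16, lr=0.5, epochs=200):
        self.W1 = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 1.0 / np.sqrt(n_hidden), n_hidden)
        self.b2 = 0.0
        self.lr, self.epochs = lr, epochs

    def _score(self, X):
        h = np.tanh(X @ self.W1 + self.b1)
        return h, h @ self.w2 + self.b2  # hidden activations, raw output

    def fit(self, X, y, sample_weight=None):
        n = len(X)
        w = np.full(n, 1.0 / n) if sample_weight is None else sample_weight / sample_weight.sum()
        for _ in range(self.epochs):
            h, s = self._score(X)
            # gradient of sum_i w_i * log(1 + exp(-y_i * s_i)) with respect to s_i
            g = -y * w / (1.0 + np.exp(y * s))
            self.w2 -= self.lr * (h.T @ g)
            self.b2 -= self.lr * g.sum()
            dh = np.outer(g, self.w2) * (1.0 - h ** 2)  # back-propagate through tanh
            self.W1 -= self.lr * (X.T @ dh)
            self.b1 -= self.lr * dh.sum(axis=0)
        return self

    def predict(self, X):
        return np.where(self._score(X)[1] >= 0.0, 1.0, -1.0)


def adaboost(X, y, n_rounds=10, variant="weighting"):
    """Discrete AdaBoost with TinyNet base learners.

    variant="weighting": pass the boosting distribution into the cost function.
    variant="sampling":  resample the training set according to the distribution.
    """
    n = len(X)
    d = np.full(n, 1.0 / n)  # boosting distribution over training examples
    nets, alphas = [], []
    for _ in range(n_rounds):
        net = TinyNet(X.shape[1])
        if variant == "sampling":
            idx = rng.choice(n, size=n, p=d)
            net.fit(X[idx], y[idx])
        else:
            net.fit(X, y, sample_weight=d)
        pred = net.predict(X)
        err = d[pred != y].sum()
        if err >= 0.5:  # weak learner no better than chance: stop boosting
            break
        err = max(err, 1e-12)  # guard against a perfect round
        alpha = 0.5 * np.log((1.0 - err) / err)
        d *= np.exp(-alpha * y * pred)  # up-weight the examples this network got wrong
        d /= d.sum()
        nets.append(net)
        alphas.append(alpha)

    def predict(X_new):
        votes = sum(a * net.predict(X_new) for a, net in zip(alphas, nets))
        return np.where(votes >= 0.0, 1.0, -1.0)

    return predict
```

With everything else held fixed, running the two variants on the same data isolates the effect of random resampling, which is the comparison the abstract describes; bagging, by contrast, would train each network on an independent bootstrap sample and combine the outputs with equal weights.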