AdaBoost is one of the most popular classification methods. In contrast to other ensemble methods such as Bagging, AdaBoost is inherently sequential: each round's example weights depend on the classifier trained in the previous round. In many data-intensive, real-world situations this may limit the practical applicability of the method. P-AdaBoost is a novel scheme for parallelizing AdaBoost that builds upon earlier results concerning the dynamics of AdaBoost weights. P-AdaBoost yields approximations to standard AdaBoost models that can be easily and efficiently distributed over a network of computing nodes. Properties of P-AdaBoost as a stochastic minimizer of the AdaBoost cost functional are discussed, and experiments are reported on both synthetic and benchmark data sets.
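For context on the sequential dependence the abstract refers to, the following is a minimal sketch of standard (sequential) AdaBoost with decision stumps in NumPy. It is background only, not the paper's P-AdaBoost scheme; function names and the stump weak learner are illustrative assumptions. Note how the weight update at the end of each round feeds the stump search of the next round, which is exactly what makes the loop hard to parallelize directly.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Standard AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # example weights, uniform at the start
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Exhaustively pick the stump (feature, threshold, sign)
        # with the lowest *weighted* training error.
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # guard the log below
        alpha = 0.5 * np.log((1 - err) / err)   # classifier vote weight
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        # Sequential step: misclassified examples gain weight for the
        # next round, so round t+1 depends on round t.
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((j, thr, sign))
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    """Weighted majority vote of the trained stumps."""
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1)
                for (j, t, s), a in zip(stumps, alphas))
    return np.sign(score)
```

P-AdaBoost, as the abstract describes, replaces this round-by-round weight recursion with an approximation of the weight dynamics that lets the rounds be distributed across computing nodes.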