Boosting is one of the most important developments in ensemble learning of the past decade. Among the many boosting methods, AdaBoost is the earliest and most widely used, attracting considerable attention for its effectiveness and practicality. To date, research on boosting has been dominated by classification problems; its extension to regression has been far less successful. In this paper, we propose a new approach to extending boosting to regression. The approach first converts a regression sample into a binary classification sample from a geometric point of view, then runs AdaBoost with support vector machine base learners on the converted classification sample. The separating-hypersurface ensemble produced by AdaBoost is then equivalent to a regression function for the original regression sample. Based on this approach, two new boosting regression methods are presented: the first adopts an explicit geometric conversion, the second an implicit one. Since both methods essentially operate on binary classification samples, the convergence property of standard AdaBoost still holds for them. Experimental results validate the effectiveness of the proposed methods.
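The pipeline described above can be sketched in a minimal form. The paper's exact geometric conversion is not reproduced here, so the ±ε shifting below, the RBF-SVC base-learner settings, and the bisection read-out are all illustrative assumptions in the spirit of the explicit conversion: each regression pair (x, y) becomes two classification points, (x, y + ε) labeled +1 and (x, y − ε) labeled −1, AdaBoost with SVM base learners is run on the converted sample, and the ensemble's separating hypersurface in (x, y) space is read off as a regression function.

```python
import numpy as np
from sklearn.svm import SVC

def regression_to_classification(X, y, eps=0.2):
    """Turn a 1-D regression sample into a binary classification sample."""
    upper = np.column_stack([X, y + eps])          # points above the curve -> +1
    lower = np.column_stack([X, y - eps])          # points below the curve -> -1
    Z = np.vstack([upper, lower])
    labels = np.hstack([np.ones(len(X)), -np.ones(len(X))])
    return Z, labels

def adaboost_svc(Z, labels, T=5):
    """Plain (discrete) AdaBoost with weighted RBF-SVC base learners."""
    n = len(labels)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(T):
        h = SVC(kernel="rbf", C=1.0, gamma=1.0)
        h.fit(Z, labels, sample_weight=w * n)
        pred = h.predict(Z)
        err = np.clip(np.sum(w * (pred != labels)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        learners.append(h)
        alphas.append(alpha)
        w *= np.exp(-alpha * labels * pred)        # upweight misclassified points
        w /= w.sum()
    return learners, alphas

def predict_regression(learners, alphas, xs, y_lo=-2.0, y_hi=2.0, iters=40):
    """Recover y(x) by bisecting on where the ensemble's decision flips sign."""
    xs = np.asarray(xs, dtype=float)
    lo = np.full_like(xs, y_lo)
    hi = np.full_like(xs, y_hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        pts = np.column_stack([xs, mid])
        F = sum(a * h.predict(pts) for h, a in zip(learners, alphas))
        above = F > 0                              # the +1 side lies above the curve
        hi = np.where(above, mid, hi)
        lo = np.where(above, lo, mid)
    return 0.5 * (lo + hi)

# Toy data: a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=200)
y = np.sin(X) + rng.normal(scale=0.05, size=200)

Z, labels = regression_to_classification(X, y)
learners, alphas = adaboost_svc(Z, labels)
y_hat = predict_regression(learners, alphas, X[:20])
mae = float(np.mean(np.abs(y_hat - y[:20])))
```

Because the converted problem is an ordinary binary classification task, any AdaBoost implementation and any weighted base learner could be substituted here; the sketch only illustrates why the separating hypersurface in the augmented (x, y) space behaves as a regression function.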