Within the framework of functional gradient descent/ascent, this paper proposes Quantile Boost (QBoost) algorithms that predict quantiles of the response of interest for regression and binary classification. Quantile Boost Regression performs gradient descent in functional space to minimize the objective function of quantile regression (QReg). In the classification scenario, the class label is defined via a hidden variable, and the quantiles of the class label are estimated by fitting the corresponding quantiles of the hidden variable. An equivalent form of the definition of quantile is introduced; its smoothed version serves as the objective function, which is maximized by functional gradient ascent to yield the Quantile Boost Classification algorithm. Extensive experiments and detailed analysis show that QBoost outperforms the original QReg and other alternatives for regression and binary classification. Furthermore, QBoost can handle high-dimensional problems and is more robust to noisy predictors.
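As a rough illustration of the regression case, the sketch below performs functional gradient descent on the quantile (check) loss with one-dimensional regression stumps as base learners. The stump learner, learning rate, and number of rounds are illustrative assumptions for this sketch, not the paper's exact procedure.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a 1-D regression stump to residuals r by exhaustive search
    over split points, minimizing squared error (illustrative base learner)."""
    best_err, best = np.inf, None
    for s in np.unique(x):
        left, right = r[x <= s], r[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        lm, rm = left.mean(), right.mean()
        err = ((left - lm) ** 2).sum() + ((right - rm) ** 2).sum()
        if err < best_err:
            best_err, best = err, (s, lm, rm)
    return best

def quantile_boost_regression(x, y, tau=0.5, n_rounds=50, lr=0.1):
    """Minimal sketch of boosting for the tau-quantile: at each round,
    fit a stump to the negative gradient of the check loss
    rho_tau(y - F(x)) and take a small functional step."""
    F = np.full(len(y), np.quantile(y, tau))  # constant initial fit
    ensemble = []
    for _ in range(n_rounds):
        # Negative gradient of the check loss at F(x_i):
        # tau where y_i > F(x_i), tau - 1 otherwise.
        g = np.where(y > F, tau, tau - 1.0)
        s, lm, rm = fit_stump(x, g)
        F += lr * np.where(x <= s, lm, rm)
        ensemble.append((s, lm, rm))
    return F, ensemble
```

Raising `tau` shifts the fitted function upward toward higher conditional quantiles, which is the behavior the check loss enforces.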