Bi-Level Path Following for Cross Validated Solution of Kernel Quantile Regression
The Journal of Machine Learning Research
Modeling conditional quantiles requires specifying which quantile is being estimated, so it can be viewed as a parameterized predictive modeling problem; the quantile loss typically used is itself parameterized by the quantile. In this paper we show how to follow the path of cross-validated solutions to regularized kernel quantile regression. Even though the bi-level optimization problem we encounter for every quantile is non-convex, the manner in which the optimal cross-validated solution evolves with the parameter of the loss function allows this solution to be tracked. We prove this property, construct the resulting algorithm, and demonstrate it on data. The algorithm allows us to efficiently solve the whole family of bi-level problems.
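For concreteness, the following is a minimal sketch of the objects the abstract refers to, written in assumed notation rather than the paper's own: the quantile ("pinball") loss with parameter \tau \in (0,1), the inner regularized kernel quantile regression problem over an RKHS \mathcal{H}, and the outer cross-validation problem whose solution the algorithm follows as \tau varies.

% Quantile (pinball) loss, parameterized by the quantile \tau
\rho_\tau(r) =
  \begin{cases}
    \tau\, r,       & r \ge 0,\\
    (\tau - 1)\, r, & r < 0,
  \end{cases}
\qquad r = y - f(x).

% Inner (lower-level) problem: regularized kernel quantile regression
\hat f_{\tau,\lambda} = \arg\min_{f \in \mathcal{H}}
  \sum_{i=1}^{n} \rho_\tau\bigl(y_i - f(x_i)\bigr) + \frac{\lambda}{2}\,\|f\|_{\mathcal{H}}^{2}.

% Outer (upper-level) problem: choose \lambda by K-fold cross-validation for each \tau
\lambda^{*}(\tau) = \arg\min_{\lambda > 0}
  \sum_{k=1}^{K} \sum_{i \in V_k} \rho_\tau\bigl(y_i - \hat f^{(-k)}_{\tau,\lambda}(x_i)\bigr),

where V_k is the k-th validation fold and \hat f^{(-k)}_{\tau,\lambda} is the inner solution fit without that fold. Under this formulation, the bi-level path-following task is to track \lambda^{*}(\tau) and the corresponding fitted functions over the whole range of \tau rather than re-solving the cross-validation problem from scratch at each quantile.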