Estimation of a regression function by maxima of minima of linear functions
IEEE Transactions on Information Theory
The problem of estimating a regression function by continuous piecewise linear functions is formulated as a nonconvex, nonsmooth optimization problem. Estimates are defined by minimizing the empirical L2 risk over a class of functions defined as maxima of minima of linear functions. An algorithm for finding such continuous piecewise linear functions is presented. We observe that the objective function in the optimization problem is semismooth, quasidifferentiable and piecewise partially separable. These properties allow us to design an efficient algorithm for approximating subgradients of the objective function and to apply the discrete gradient method for its minimization. We present computational results on simulated data and compare the new estimator with a number of existing ones.
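To illustrate the function class and the risk being minimized, the following is a minimal sketch (not the paper's discrete gradient method): it represents f(x) = max_k min_j (a_kj·x + b_kj) and fits the parameters by plain subgradient descent on the empirical L2 risk, where at each sample only the active linear piece receives a gradient. All names, the group sizes K and J, and the optimizer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def maxmin_predict(params, X):
    """Evaluate f(x) = max_k min_j (<a_kj, x> + b_kj).

    params has shape (K, J, d+1): K groups of J affine functions,
    with the last slot of each row holding the intercept b_kj.
    Returns the predictions and the (n, K, J) array of affine values.
    """
    lin = np.einsum('kjd,nd->nkj', params[:, :, :-1], X) + params[:, :, -1]
    return lin.min(axis=2).max(axis=1), lin

def fit_maxmin(X, y, K=2, J=2, lr=0.1, epochs=500):
    """Subgradient descent on the empirical L2 risk (illustrative only)."""
    n, d = X.shape
    params = rng.normal(scale=0.5, size=(K, J, d + 1))
    Xb = np.hstack([X, np.ones((n, 1))])  # append 1 for the intercept
    for _ in range(epochs):
        pred, lin = maxmin_predict(params, X)
        # active piece per sample: the argmax group, then its argmin function
        k_star = lin.min(axis=2).argmax(axis=1)
        j_star = lin[np.arange(n), k_star].argmin(axis=1)
        resid = pred - y
        grad = np.zeros_like(params)
        # a subgradient of (1/n) sum (f(x_i) - y_i)^2 touches only active pieces
        np.add.at(grad, (k_star, j_star), (2.0 / n) * resid[:, None] * Xb)
        params -= lr * grad
    return params
```

For example, with K=2 and J=1 the class contains the maximum of two lines, which is enough to recover a V-shaped target such as y = |x|; richer targets need larger K and J, at the price of a less tractable nonconvex landscape, which is exactly what motivates the semismoothness and partial separability exploited in the paper.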