A learning machine, or model, is usually trained by minimizing a given criterion (the expectation of a cost function) measuring the discrepancy between the model output and the desired output. As is well known, the choice of the cost function has a profound impact on the probabilistic interpretation of the output of the model after training. In this work, we use the calculus of variations to tackle this problem. In particular, we derive necessary and sufficient conditions on the cost function ensuring that the output of the trained model approximates 1) the conditional expectation of the desired output given the explanatory variables; 2) the conditional median (and, more generally, the q-quantile); 3) the conditional geometric mean; and 4) the conditional variance. The same method can be applied to the estimation of other summary statistics as well. We also argue that the least absolute deviations criterion could, in some cases, act as an alternative to the ordinary least squares criterion for nonlinear regression. In the same vein, the concept of "regression quantile" is briefly discussed.
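To make the flavor of these results concrete, the following minimal sketch (ours, not taken from the paper; NumPy is assumed) checks the unconditional analogues numerically. The constant prediction that minimizes the empirical risk under the squared, absolute, pinball, and log-squared losses approaches, respectively, the mean, the median, the q-quantile, and the geometric mean of the target; the conditional statements above are the same idea applied after conditioning on the explanatory variables.

import numpy as np

rng = np.random.default_rng(0)
y = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # skewed, positive targets

cs = np.linspace(0.1, 6.0, 1000)  # candidate constant model outputs

def argmin_risk(loss):
    # Return the constant c minimizing the empirical risk E[loss(y, c)].
    risks = [loss(y, c).mean() for c in cs]
    return cs[int(np.argmin(risks))]

q = 0.9
squared  = lambda y, c: (y - c) ** 2                              # -> mean
absolute = lambda y, c: np.abs(y - c)                             # -> median
pinball  = lambda y, c: np.maximum(q * (y - c), (q - 1) * (y - c))  # -> q-quantile
log_sq   = lambda y, c: (np.log(y) - np.log(c)) ** 2              # -> geometric mean

print("squared loss  ->", argmin_risk(squared),  "  mean:          ", y.mean())
print("absolute loss ->", argmin_risk(absolute), "  median:        ", np.median(y))
print("pinball loss  ->", argmin_risk(pinball),  "  0.9-quantile:  ", np.quantile(y, q))
print("log-sq loss   ->", argmin_risk(log_sq),   "  geometric mean:", np.exp(np.log(y).mean()))

The skewed target distribution is chosen deliberately: it separates the four statistics (here roughly 1.65, 1.0, 3.6, and 1.0), so each loss visibly selects a different minimizer. The log-squared loss is one standard choice yielding the geometric mean; the paper's conditions characterize the admissible cost functions in general.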