Technical Note: Naive Bayes for Regression. Machine Learning.
Neural Networks for Pattern Recognition.
A Simple Approach to Ordinal Classification. EMCL '01: Proceedings of the 12th European Conference on Machine Learning.
Modeling Auction Price Uncertainty Using Boosting-based Conditional Density Estimation. ICML '02: Proceedings of the Nineteenth International Conference on Machine Learning.
Inference for the Generalization Error. Machine Learning.
Predicting probability distributions for surf height using an ensemble of mixture density networks. ICML '05: Proceedings of the 22nd International Conference on Machine Learning.
Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning).
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition (Morgan Kaufmann Series in Data Management Systems).
Nonparametric Quantile Estimation. The Journal of Machine Learning Research.
Using neural networks to model conditional multivariate densities. Neural Computation.
Calibrating Probability Density Forecasts with Multi-objective Search. Proceedings of ECAI 2006: 17th European Conference on Artificial Intelligence, Riva del Garda, Italy, August 29 -- September 1, 2006.
Interpolating conditional density trees. UAI '02: Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence.
Making good probability estimates for regression. ECML '06: Proceedings of the 17th European Conference on Machine Learning.
Confidence estimation methods for neural networks: a practical comparison. IEEE Transactions on Neural Networks.
Many regression schemes deliver only a point estimate, but it is often useful, or even essential, to quantify the uncertainty inherent in a prediction. If a conditional density estimate is available, prediction intervals can be derived from it. In this paper we compare three techniques for computing conditional density estimates using a class probability estimator: the estimator is applied to the discretized target variable, and its predicted class probabilities are used to derive instance weights for an underlying univariate density estimator, which then yields the conditional density estimate. The three density estimators we compare are a histogram estimator that has been used previously in this context, a normal density estimator, and a kernel estimator. In our experiments, the latter two deliver better performance, both in terms of cross-validated log-likelihood and in terms of the quality of the resulting prediction intervals. The empirical coverage of the intervals is close to the desired confidence level in most cases. We also include results for point estimation, as well as a comparison to Gaussian process regression and nonparametric quantile estimation.
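To make the pipeline concrete, the sketch below illustrates the general approach described in the abstract: discretize the target, train a class probability estimator on the resulting bins, turn its predicted bin probabilities into weights on the training instances, and fit a weighted univariate density estimator from which a prediction interval can be read off. The specific choices here (equal-frequency binning, scikit-learn's RandomForestClassifier as the class probability estimator, spreading each bin's probability mass evenly over the training instances in that bin, and a central quantile interval) are illustrative assumptions, not details taken from the paper.

```python
# Sketch of conditional density estimation via a class probability estimator.
# Assumed, not from the paper: equal-frequency binning, a random forest as the
# class probability estimator, the p_k / n_k weighting, and the grid-based CDF.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fit_cde(X, y, n_bins=10):
    """Discretize y and train a class probability estimator on the bins."""
    # Interior equal-frequency bin edges (an assumed discretization scheme).
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(y, edges)                     # bin index per instance
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, bins)
    return clf, bins

def instance_weights(clf, bins, x):
    """Turn predicted bin probabilities into weights on training instances."""
    proba = clf.predict_proba(x.reshape(1, -1))[0]   # P(bin | x)
    w = np.zeros(len(bins))
    for p, c in zip(proba, clf.classes_):
        mask = bins == c
        if mask.any():
            w[mask] = p / mask.sum()                 # spread bin mass evenly
    return w / w.sum()

def normal_cde(y_train, w):
    """Weighted normal density estimate: mean and standard deviation."""
    mu = np.sum(w * y_train)
    var = np.sum(w * (y_train - mu) ** 2)
    return mu, np.sqrt(var)

def kernel_cde(y_train, w, grid, bandwidth):
    """Weighted Gaussian-kernel density estimate evaluated on a grid."""
    z = (grid[:, None] - y_train[None, :]) / bandwidth
    k = np.exp(-0.5 * z ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return k @ w

def prediction_interval(grid, density, level=0.95):
    """Central interval from the (approximate) conditional CDF on the grid."""
    cdf = np.cumsum(density)
    cdf /= cdf[-1]
    lo = grid[np.searchsorted(cdf, (1 - level) / 2)]
    hi = grid[np.searchsorted(cdf, 1 - (1 - level) / 2)]
    return lo, hi
```

A typical use would call fit_cde on the training data, compute instance_weights for a test instance, evaluate kernel_cde (or normal_cde) over a grid spanning the observed target range, and pass the result to prediction_interval to obtain, for example, a 95% interval; the bandwidth and grid resolution are further tuning choices left open in this sketch.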