Neural network outputs are interpreted as parameters of statistical distributions. This allows us to fit conditional distributions in which the parameters depend on the inputs to the network. We exploit this in modeling multivariate data (with the univariate case included as a special instance), where there may be input-dependent (e.g., time-dependent) correlations between output components. This provides a novel way of modeling conditional correlation that extends existing techniques for determining input-dependent (local) error bars.
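The idea above can be sketched in code. A minimal illustration, assuming a network whose final layer emits five unconstrained values for each input, interpreted as the parameters of a conditional bivariate Gaussian: two means, two log standard deviations, and a raw correlation. The parameterization below (`exp` for scales, `tanh` for the correlation) and the function name `bivariate_gaussian_nll` are illustrative choices, not the paper's exact construction; training would minimize this negative log-likelihood over the data.

```python
import numpy as np

def bivariate_gaussian_nll(raw, y):
    """Interpret five raw network outputs as parameters of a conditional
    bivariate Gaussian and return the negative log-likelihood of target y.

    raw = [mu1, mu2, log_sigma1, log_sigma2, rho_raw]  (unconstrained)
    """
    mu = raw[:2]
    sigma = np.exp(raw[2:4])   # exp keeps the scales positive
    rho = np.tanh(raw[4])      # tanh constrains the correlation to (-1, 1)

    # Standardized residuals of the two output components.
    z1 = (y[0] - mu[0]) / sigma[0]
    z2 = (y[1] - mu[1]) / sigma[1]

    # Quadratic form of the bivariate Gaussian with correlation rho.
    q = (z1**2 - 2.0 * rho * z1 * z2 + z2**2) / (1.0 - rho**2)

    # Negative log of the normalizing constant, plus half the quadratic form.
    log_norm = (np.log(2.0 * np.pi) + np.log(sigma[0]) + np.log(sigma[1])
                + 0.5 * np.log(1.0 - rho**2))
    return log_norm + 0.5 * q
```

Because the five raw values are themselves functions of the network input, the fitted means, variances, and correlation all become input-dependent, which is exactly how local error bars extend to local correlation in this framework.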