Robust regression and outlier detection
Elements of information theory
The nature of statistical learning theory
Estimation of entropy and mutual information. Neural Computation
A Compression Approach to Support Vector Model Selection. The Journal of Machine Learning Research
Nonparametric Quantile Estimation. The Journal of Machine Learning Research
Consistency of kernel-based quantile regression. Applied Stochastic Models in Business and Industry
Mutual information and minimum mean-square error in Gaussian channels. IEEE Transactions on Information Theory
This paper studies a distribution-free estimator of the conditional support and tolerance intervals of the distribution underlying a set of paired independent and identically distributed (i.i.d.) observations. The key ingredients are (a) an appropriate notion of risk, which measures the probability mass not captured by the estimate; (b) a uniform concentration inequality for the empirical risk, based on a compression argument; and (c) a lower bound on the mutual information, which dictates how to maximize the informativeness of the estimator. For the last result, Fano's inequality is extended to the bivariate case.
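The risk notion in (a) has a simple empirical counterpart: the fraction of paired observations whose response falls outside the estimated conditional interval. The following is a minimal sketch of that computation, assuming hypothetical `lower` and `upper` interval functions; it is not the paper's actual estimator, only an illustration of the uncaptured-mass risk.

```python
import numpy as np

def empirical_risk(x, y, lower, upper):
    """Fraction of pairs (x_i, y_i) with y_i outside the estimated
    conditional interval [lower(x_i), upper(x_i)] -- the empirical
    analogue of the probability mass not captured by the estimate."""
    lo = lower(x)
    up = upper(x)
    missed = (y < lo) | (y > up)
    return missed.mean()

# Toy check: a constant interval [-2, 2] applied to standard-normal
# responses; the true uncaptured mass is P(|Y| > 2) ~= 0.046.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = rng.normal(size=1000)
risk = empirical_risk(
    x, y,
    lambda x: np.full_like(x, -2.0),
    lambda x: np.full_like(x, 2.0),
)
```

A uniform concentration inequality, as in ingredient (b), then bounds how far this empirical risk can deviate from the true uncaptured probability mass, simultaneously over the class of interval estimators.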