Model-free estimates of the noise variance are important for model selection and for setting tuning parameters. This paper discusses a data representation that leads to such an estimator for multivariate data. Its visual representation, called the differogram cloud here, is based on the 2-norm of the differences of input and output data. The crucial concept of locality in this representation is expressed as the increasing variance of the differences, which does not rely explicitly on an extra hyper-parameter. Connections with U-statistics, Taylor series expansions and other related methods are given, and numerical simulations indicate convergence of the estimator. The paper extends these results towards a time-dependent setting and to the case of non-Gaussian noise models or outliers. As an application, it focuses on model selection for Least Squares Support Vector Machines (LS-SVMs): a variant of the LS-SVM regressor is derived based on Morozov's discrepancy principle, which relates the regularization constant directly to the (observed) noise level.
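The core idea (differences of outputs at nearby inputs reveal the noise level without fitting a model) can be illustrated with a minimal sketch. This is not the paper's differogram estimator itself but a related first-order, nearest-neighbour variant under an assumed additive-noise model y = f(x) + e: for inputs that are close in 2-norm, E[(y_i - y_j)^2] ≈ 2σ², so half the mean squared output difference over nearest-neighbour pairs estimates the noise variance σ². The function name and interface are illustrative.

```python
import numpy as np

def nn_noise_variance(X, y):
    """Model-free noise variance estimate from nearest-neighbour
    output differences (a first-order, differogram-style idea:
    for close inputs, E[(y_i - y_j)^2] is roughly 2*sigma^2)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float).ravel()
    # pairwise squared 2-norm input distances, diagonal excluded
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)
    nn = d2.argmin(axis=1)              # nearest neighbour of each point
    return 0.5 * np.mean((y - y[nn]) ** 2)

# Example: smooth signal plus Gaussian noise of known variance 0.04
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(2000, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(0.0, 0.2, size=2000)
print(nn_noise_variance(X, y))          # close to sigma^2 = 0.04
```

Such an estimate can then drive model selection directly, e.g. via Morozov's discrepancy principle, by tuning the regularization constant so that the residual level of the fitted regressor matches the estimated noise level.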