Function Estimation by Feedforward Sigmoidal Networks with Bounded Weights
Neural Processing Letters
Nonparametric Time Series Prediction Through Adaptive Model Selection
Machine Learning
On Fusers that Perform Better than Best Sensor
IEEE Transactions on Pattern Analysis and Machine Intelligence
Prediction from Randomly Right Censored Data
Journal of Multivariate Analysis
Model Selection and Error Estimation
Machine Learning
Theoretical Properties of Projection Based Multilayer Perceptrons with Functional Inputs
Neural Processing Letters
Almost Linear VC-Dimension Bounds for Piecewise Polynomial Networks
Neural Computation
Rates of Convergence for Partitioning and Nearest Neighbor Regression Estimates with Unbounded Data
Journal of Multivariate Analysis
Asymptotic Confidence Intervals for Poisson Regression
Journal of Multivariate Analysis
Pattern Recognition for Conditionally Independent Data
The Journal of Machine Learning Research
Nonlinear Function Learning Using Radial Basis Function Networks: Convergence and Rates
ICAISC '08 Proceedings of the 9th international conference on Artificial Intelligence and Soft Computing
MAD Loss in Pattern Recognition and RBF Learning
ICAISC '08 Proceedings of the 9th international conference on Artificial Intelligence and Soft Computing
Consistency of Functional Learning Methods Based on Derivatives
Pattern Recognition Letters
Nonlinear Function Learning by the Normalized Radial Basis Function Networks
ICAISC'06 Proceedings of the 8th international conference on Artificial Intelligence and Soft Computing
Computational Statistics & Data Analysis
Efficient Regression in Metric Spaces via Approximate Lipschitz Extension
SIMBAD'13 Proceedings of the Second international conference on Similarity-Based Pattern Recognition
A general notion of universal consistency of nonparametric estimators is introduced that applies to regression estimation, conditional median estimation, curve fitting, pattern recognition, and concept learning. General methods are given for proving the consistency of estimators based on minimizing the empirical error. In particular, distribution-free almost sure consistency of neural network estimates and generalized linear estimators is established.
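The abstract's central device is minimizing the empirical error over a class of candidate functions. A minimal sketch of that idea, in Python with an illustrative linear candidate class and squared loss (all names and the toy target are assumptions for illustration, not the paper's general setting):

```python
import random

# Empirical risk minimization (ERM) sketch: among a small candidate class,
# select the function with the smallest empirical squared error on a sample.

def empirical_error(f, sample):
    """Mean squared error of f over the sample of (x, y) pairs."""
    return sum((f(x) - y) ** 2 for x, y in sample) / len(sample)

def erm(candidates, sample):
    """Return the candidate minimizing the empirical error."""
    return min(candidates, key=lambda f: empirical_error(f, sample))

# Toy target y = 2x; candidate class of linear functions f_a(x) = a * x.
random.seed(0)
sample = [(x, 2.0 * x) for x in (random.uniform(-1, 1) for _ in range(50))]
candidates = [lambda x, a=a: a * x for a in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]]

best = erm(candidates, sample)
print(best(1.0))  # the minimizer is f(x) = 2x, so this prints 2.0
```

Consistency results of the kind described above concern what happens as the sample grows and the candidate class expands (e.g. neural networks of increasing size): the ERM estimate's error converges to the best achievable error, for every data distribution.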