In some real-world applications, such as spectrometry, functional models achieve better predictive performance when they work on the order-m derivatives of their inputs rather than on the original functions. The use of derivatives is therefore common practice in Functional Data Analysis, despite the lack of theoretical guarantees on the asymptotically achievable performance of a derivative-based model. In this paper, we show that a smoothing spline approach can be used to preprocess multivariate observations, obtained by sampling functions on a discrete and finite sampling grid, in a way that leads to a consistent scheme for the original infinite-dimensional functional problem. This work extends (Mas and Pumo, 2009) to nonparametric approaches and to incomplete knowledge of the functions. More precisely, the paper tackles two difficulties in a nonparametric framework: the information loss due to using the derivatives instead of the original functions, and the information loss due to the functions being observed through discrete sampling and thus being imperfectly known. A smoothing spline based approach solves both problems. Finally, the proposed approach is tested on two real-world datasets and is experimentally shown to be a good solution in the case of noisy functional predictors.
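As an illustration of the preprocessing step described above, the following sketch fits a smoothing spline to noisy samples of a function on a finite grid and evaluates its order-m derivative, which would then replace the raw samples as the model's input. The grid size, noise level, spline degree, and smoothing parameter are all illustrative assumptions, not values from the paper; SciPy's `UnivariateSpline` stands in for whichever spline smoother one actually uses.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical setup: one functional predictor observed with noise on a
# discrete, finite sampling grid (values chosen for illustration only).
grid = np.linspace(0.0, 1.0, 100)
rng = np.random.default_rng(0)
observed = np.sin(2 * np.pi * grid) + rng.normal(scale=0.05, size=grid.size)

# Smoothing spline fit: the parameter s trades data fidelity against
# roughness; here it follows the common n * sigma^2 heuristic, ad hoc.
spline = UnivariateSpline(grid, observed, k=5, s=grid.size * 0.05**2)

# The order-m derivative of the smoothed curve (m = 2 here) is the
# quantity fed to the regression model instead of the raw samples.
m = 2
derivative = spline.derivative(n=m)(grid)
```

Note that the spline degree `k` must exceed the derivative order `m` for the estimated derivative to remain smooth, which is why a quintic spline is used for a second derivative in this sketch.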