Real and complex analysis, 3rd ed.
Approximation capabilities of multilayer feedforward networks
Neural Networks
Some new results on neural network approximation
Neural Networks
Simultaneous non-parametric regressions of unbalanced longitudinal data
Computational Statistics & Data Analysis
An introduction to Support Vector Machines and other kernel-based learning methods
Statistical modelling of functional data
Applied Stochastic Models in Business and Industry
Theoretical Properties of Projection Based Multilayer Perceptrons with Functional Inputs
Neural Processing Letters
Dual features functional support vector machines for fault detection of rechargeable batteries
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
Representation of functional data in neural networks
Neurocomputing
Support vector machine for functional data classification
Neurocomputing
Support vector regression methods for functional data
Proceedings of the 12th Iberoamerican Congress on Pattern Recognition: Progress in Pattern Recognition, Image Analysis and Applications (CIARP 2007)
Self-organizing multilayer perceptron
IEEE Transactions on Neural Networks
A functional density-based nonparametric approach for statistical calibration
Proceedings of the 15th Iberoamerican Congress on Pattern Recognition: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications (CIARP 2010)
Consistency of functional learning methods based on derivatives
Pattern Recognition Letters
Functional data analysis in shape analysis
Computational Statistics & Data Analysis
In this paper, we study a natural extension of multi-layer perceptrons (MLPs) to functional inputs. We show that fundamental results for classical MLPs carry over to functional MLPs. We obtain universal approximation results showing that the expressive power of functional MLPs is comparable to that of numerical MLPs, and consistency results implying that the estimation of optimal parameters for a functional MLP is statistically well defined. Finally, we show on simulated and real-world data that the proposed model performs very satisfactorily.
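A common way to realize the extension described above is to represent each functional input by its coefficients on a finite basis and feed that coefficient vector to an ordinary perceptron. The sketch below, with illustrative names and sizes not taken from the paper, projects a sampled function onto a small cosine basis by numerical integration and runs a one-hidden-layer MLP forward pass on the coefficients:

```python
import numpy as np

def _integrate(y, t):
    """Trapezoidal-rule approximation of the integral of y over the grid t."""
    return float(np.sum((y[:-1] + y[1:]) * np.diff(t)) / 2.0)

def basis_coefficients(f_samples, t, n_basis=5):
    """Project a function sampled at points t onto cosine basis functions
    phi_k(t) = cos(k * pi * t), normalizing by ||phi_k||^2."""
    coeffs = []
    for k in range(n_basis):
        phi = np.cos(k * np.pi * t)
        norm = _integrate(phi * phi, t)
        coeffs.append(_integrate(f_samples * phi, t) / norm)
    return np.array(coeffs)

class FunctionalMLP:
    """Illustrative one-hidden-layer perceptron acting on basis coefficients."""
    def __init__(self, n_in, n_hidden, rng):
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))  # hidden weights
        self.b1 = np.zeros(n_hidden)                      # hidden biases
        self.w2 = rng.normal(0.0, 0.5, n_hidden)          # output weights
        self.b2 = 0.0                                     # output bias

    def forward(self, coeffs):
        h = np.tanh(self.W1 @ coeffs + self.b1)  # hidden activations
        return float(self.w2 @ h + self.b2)      # scalar output

# One noisy functional observation, sampled on a uniform grid.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
f = np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=t.size)

c = basis_coefficients(f, t, n_basis=5)
net = FunctionalMLP(n_in=5, n_hidden=8, rng=rng)
y = net.forward(c)
print(c.shape, np.isfinite(y))
```

Once the functional input is reduced to a fixed-length coefficient vector, training proceeds exactly as for a numerical MLP (e.g. gradient descent on the weights), which is why the classical approximation and consistency arguments can be adapted to this setting.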