We had previously shown that regularization principles lead to approximation schemes, such as Radial Basis Functions, that are equivalent to networks with one layer of hidden units, called Regularization Networks. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models, Breiman's hinge functions, and some forms of Projection Pursuit Regression. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we also show a relation between activation functions of the Gaussian and sigmoidal type.
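To make the first claim concrete, a minimal sketch of a Radial Basis Function regularization network is given below: the approximant is f(x) = Σ_i c_i G(‖x − x_i‖), with one Gaussian unit centered at each data point, and the coefficients obtained by solving the regularized linear system (G + λI)c = y. The kernel width `sigma`, the regularization weight `lam`, and the 1-D toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gaussian_gram(x, centers, sigma=0.1):
    # Gram matrix G[i, j] = exp(-(x_i - center_j)^2 / (2 sigma^2))
    d = x[:, None] - centers[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def fit_rbf(x, y, lam=1e-3, sigma=0.1):
    # One hidden unit per data point; lam penalizes non-smooth solutions.
    G = gaussian_gram(x, x, sigma)
    return np.linalg.solve(G + lam * np.eye(len(x)), y)

def predict(x_new, x_train, c, sigma=0.1):
    return gaussian_gram(x_new, x_train, sigma) @ c

# Toy example: recover a smooth function from 20 samples.
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)
c = fit_rbf(x, y)
y_hat = predict(x, x, c)
print(np.max(np.abs(y_hat - y)))
```

Each column of the Gram matrix plays the role of one hidden unit's activation, so the solve above is exactly the training of a one-hidden-layer network whose weights are fixed at the data points.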