Two types of computational models are investigated in the framework of scaled kernels: radial-basis-function networks whose units have varying widths, and kernel networks in which all units share a fixed width. The impact of kernel widths on the approximation of multivariable functions, on generalization modelled by regularization with kernel stabilizers, and on the minimization of error functionals is analyzed.
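The two model classes can be written as one parametric family: a Gaussian network whose widths are either free per unit or tied to a single shared value. A minimal sketch of this distinction (the function names, centers, weights, and width values below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gaussian_network(x, centers, widths, weights):
    """Evaluate f(x) = sum_i w_i * exp(-||x - c_i||^2 / b_i^2).

    If all entries of `widths` are equal, this is the fixed-width
    kernel network; distinct entries give the varying-width
    radial-basis-function network.
    """
    diffs = x[None, :] - centers            # shape (n_units, dim)
    sq_dists = np.sum(diffs ** 2, axis=1)   # squared Euclidean distances
    return float(np.dot(weights, np.exp(-sq_dists / widths ** 2)))

# Same centers and output weights; only the width parameters differ.
centers = np.array([[0.0], [1.0]])
weights = np.array([1.0, -0.5])
x = np.array([0.5])

varying = gaussian_network(x, centers, np.array([0.3, 1.5]), weights)
fixed = gaussian_network(x, centers, np.array([1.0, 1.0]), weights)
```

Because the negative-weighted unit can be given a much wider kernel than the positive one, the varying-width model realizes sign patterns at `x` that the fixed-width model with the same centers and weights cannot.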