The role of width in kernel models and radial-basis-function networks is investigated, with special emphasis on the Gaussian case. Quantitative bounds on kernel-based regularization are given that show the effect of changing the width. These bounds take the form of d-th powers of width ratios and thus grow exponentially with the dimension d of the input data.
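As an illustrative sketch only (the abstract does not specify the regularization scheme), the following assumes the common Tikhonov form of kernel-based regularization, i.e. kernel ridge regression with a Gaussian kernel, and shows how the width parameter enters the Gram matrix and the RKHS norm of the fitted function. All function names and the choice of target function are hypothetical.

```python
import numpy as np

def gaussian_kernel(X, Y, width):
    # Gaussian kernel K(x, y) = exp(-||x - y||^2 / width^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / width**2)

def kernel_ridge_fit(X, y, width, lam):
    # Minimize  sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
    # (Tikhonov regularization in the RKHS of the Gaussian kernel);
    # the minimizer is f = sum_i c_i K(., x_i) with c = (K + lam I)^{-1} y.
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 2))           # input dimension d = 2
y = np.sin(3 * X[:, 0]) * np.cos(3 * X[:, 1])  # smooth target function

for width in (0.2, 0.5, 1.0, 2.0):
    c = kernel_ridge_fit(X, y, width, lam=1e-3)
    K = gaussian_kernel(X, X, width)
    norm2 = c @ K @ c                  # squared RKHS norm of the fit
    resid = np.abs(K @ c - y).max()    # fit error on the training points
    print(f"width={width:3.1f}  ||f||_K^2={norm2:9.3f}  max residual={resid:.4f}")
```

Varying `width` trades off how sharply each basis function is localized against how large an RKHS norm the fit requires, which is the kind of width dependence the quantitative bounds in the abstract make precise.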