Approximating networks and extended Ritz method for the solution of functional optimization problems. Journal of Optimization Theory and Applications.
Big Omicron and big Omega and big Theta. ACM SIGACT News.
Journal of Approximation Theory.
Complexity of Gaussian-radial-basis networks approximating smooth functions. Journal of Complexity.
Accuracy of suboptimal solutions to kernel principal component analysis. Computational Optimization and Applications.
Approximate minimization of the regularized expected error over kernel models. Mathematics of Operations Research.
On tractability of neural-network approximation. ICANNGA'09 Proceedings of the 9th International Conference on Adaptive and Natural Computing Algorithms.
Some comparisons of model complexity in linear and neural-network approximation. ICANN'10 Proceedings of the 20th International Conference on Artificial Neural Networks: Part III.
Bounds on rates of variable-basis and neural-network approximation. IEEE Transactions on Information Theory.
Comparison of worst case errors in linear and neural network approximation. IEEE Transactions on Information Theory.
On the exponential convergence of matching pursuits in quasi-incoherent dictionaries. IEEE Transactions on Information Theory.
Geometric upper bounds on rates of variable-basis approximation. IEEE Transactions on Information Theory.
Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory.
On the geometric convergence of neural approximations. IEEE Transactions on Neural Networks.
Some comparisons of networks with radial and kernel units. ICANN'12 Proceedings of the 22nd International Conference on Artificial Neural Networks and Machine Learning - Volume Part II.
Approximation capabilities of two types of computational models are explored: dictionary-based models (linear combinations of n-tuples of basis functions computable by units belonging to a set called a "dictionary") and linear models (linear combinations of n fixed basis functions). The two models are compared in terms of approximation rates, i.e., the speed of decrease of approximation errors as the number n of basis functions grows. Inspection of proofs of upper bounds on approximation rates by dictionary-based models shows that, for individual functions, these proofs do not yield estimates for dictionary-based models that fail to hold for some linear models as well. In contrast, dictionary-based models are shown to achieve faster approximation rates for worst-case errors over suitable sets of functions; for such sets, even geometric upper bounds hold.
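The contrast described above can be illustrated numerically. The following sketch is hypothetical and not from the paper: it compares a linear model using n *fixed* basis functions against a dictionary-based model that greedily selects n atoms from a larger dictionary (orthogonal-matching-pursuit style). The Gaussian dictionary, the target function, and the grid are all illustrative choices.

```python
import numpy as np

# Hypothetical setup (not from the paper): a 1-D grid and a piecewise-constant
# target that is hard to capture with a few fixed, narrowly placed atoms.
x = np.linspace(0, 1, 200)
target = np.sign(np.sin(7 * x))

# Dictionary: Gaussian bumps with several widths and many centers.
centers = np.linspace(0, 1, 40)
widths = [0.02, 0.05, 0.1]
atoms = np.array([np.exp(-((x - c) / w) ** 2)
                  for w in widths for c in centers])  # shape (120, 200)

def linear_error(n):
    # Linear model: least squares over the FIRST n atoms, fixed in advance.
    A = atoms[:n].T
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return float(np.linalg.norm(target - A @ coef))

def greedy_error(n):
    # Dictionary-based model: greedily pick the atom most correlated with
    # the residual, then re-fit by least squares (orthogonal matching pursuit).
    chosen, residual = [], target.copy()
    for _ in range(n):
        k = int(np.argmax(np.abs(atoms @ residual)))
        chosen.append(k)
        A = atoms[chosen].T
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        residual = target - A @ coef
    return float(np.linalg.norm(residual))

for n in (5, 10, 20):
    print(n, round(linear_error(n), 3), round(greedy_error(n), 3))
```

For an individual target such as this one, the greedy dictionary-based model typically attains a much smaller error for the same n, since it may place atoms anywhere, whereas the fixed basis must be chosen before seeing the target. This illustrates the gap the abstract describes, without contradicting its first point: some other linear model (with a better-chosen fixed basis) could match the dictionary-based rate for this particular function.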