In this paper, the problem of learning the functional dependency between input and output variables from scattered data using fractional polynomial models (FPM) is investigated. Estimation error bounds are obtained by computing the pseudo-dimension of FPM, which is shown to equal that of sparse polynomial models (SPM). A linear decay of the approximation error is established for a class of target functions that is dense in the space of continuous functions. We derive a structural risk analogous to the Schwarz criterion and show theoretically that the model minimizing this structural risk achieves a favorable balance between estimation and approximation errors. An empirical model selection comparison is also performed to justify the use of this structural risk in selecting the optimal complexity index from the data. We show that the construction of FPM can be handled efficiently by the variable projection method. Furthermore, our empirical study suggests that FPM can attain better generalization performance than SPM and cubic splines.
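The variable projection idea mentioned above exploits the structure of a fractional polynomial f(x) = Σ_j c_j x^{p_j}: for fixed real exponents p_j the coefficients c_j solve a linear least-squares problem, so the outer optimization only searches over the exponents. The sketch below is a minimal illustration of that separation, not the authors' implementation; the helper names (`fpm_design`, `vp_objective`, `fit_fpm`) and the choice of Nelder-Mead for the outer search are assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

def fpm_design(x, powers):
    # Design matrix with columns x**p_j (assumes x > 0 so real powers are defined)
    return np.column_stack([x**p for p in powers])

def vp_objective(powers, x, y):
    # Variable projection: for fixed exponents, the linear coefficients
    # are eliminated via least squares, leaving a reduced objective.
    A = fpm_design(x, powers)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return 0.5 * np.sum(r**2)

def fit_fpm(x, y, init_powers):
    # Outer search over exponents only (derivative-free for simplicity);
    # coefficients are recovered once the exponents have converged.
    res = minimize(vp_objective, init_powers, args=(x, y), method="Nelder-Mead")
    powers = res.x
    A = fpm_design(x, powers)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return powers, coef

# Toy data from a true fractional polynomial: 3*x^0.5 - x^1.5 (noiseless)
rng = np.random.default_rng(0)
x = rng.uniform(0.1, 2.0, 200)
y = 3.0 * x**0.5 - 1.0 * x**1.5
powers, coef = fit_fpm(x, y, init_powers=np.array([0.4, 1.6]))
```

Because the inner least-squares step is exact, the outer problem has only as many free variables as there are exponents, which is what makes the construction efficient for small complexity indices.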