Average-case learning curves for radial basis function networks
Neural Computation
Hybrid learning of mapping and its Jacobian in multilayer neural networks
Neural Computation
Online learning in radial basis function networks
Neural Computation
Natural discriminant analysis using interactive Potts models
Neural Computation
On different facets of regularization theory
Neural Computation
A Sensitivity Clustering Method for Hybrid Evolutionary Algorithms
IWINAC '09 Proceedings of the 3rd International Work-Conference on The Interplay Between Natural and Artificial Computation: Part I: Methods and Models in Artificial and Natural Computation. A Homage to Professor Mira's Scientific Legacy
A predictive and probabilistic load-balancing algorithm for cluster-based web servers
Applied Soft Computing
A dynamic over-sampling procedure based on sensitivity for multi-class problems
Pattern Recognition
Evolutionary q-Gaussian radial basis functions for binary classification
HAIS'10 Proceedings of the 5th international conference on Hybrid Artificial Intelligence Systems - Volume Part II
The two-layer radial basis function network, with fixed centers of the basis functions, is analyzed within a stochastic training paradigm. Various definitions of generalization error are considered, and two such definitions are employed in deriving generic learning curves and generalization properties, both with and without a weight-decay term. The generalization error is shown analytically to be related to the evidence and, via the evidence, to the prediction error and free energy. The generalization behavior is explored; the generic learning curve is found to be inversely proportional to the number of training pairs presented. Optimization of training is considered by minimizing the generalization error with respect to the free parameters of the training algorithms. Finally, the effect of the joint activations between hidden-layer units is examined and shown to speed training.
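The setting the abstract describes can be illustrated with a minimal sketch: a two-layer RBF network whose centers are fixed, with the output weights trained under a squared-error cost plus a weight-decay (ridge) penalty. Everything below (the toy sine target, the chosen centers, widths, and decay constant) is an illustrative assumption, not the paper's exact analysis; it only shows the ingredients, including how test error shrinks as the number of training pairs grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, width):
    # Design matrix Phi[n, k] = exp(-||x_n - c_k||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_rbf(X, y, centers, width, decay):
    # Closed-form minimizer of ||Phi w - y||^2 + decay * ||w||^2
    # (the weight-decay term regularizing the output layer).
    Phi = rbf_features(X, centers, width)
    A = Phi.T @ Phi + decay * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

# Toy 1-D regression with fixed, evenly spaced centers (an assumption
# for illustration): test error falls as the number p of training
# pairs increases, qualitatively matching a 1/p-type learning curve.
centers = np.linspace(-1, 1, 10)[:, None]
X_test = np.linspace(-1, 1, 200)[:, None]
y_test = np.sin(3 * X_test[:, 0])
for p in (20, 80, 320):
    X = rng.uniform(-1, 1, (p, 1))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(p)
    w = train_rbf(X, y, centers, width=0.3, decay=1e-3)
    err = np.mean((rbf_features(X_test, centers, 0.3) @ w - y_test) ** 2)
    print(p, float(err))
```

The decay constant plays the role of the weight-decay term in the abstract; setting it to zero recovers ordinary least squares on the fixed RBF features.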