Information Sciences: an International Journal
The problem of inductive supervised learning is discussed in this paper within the context of multi-objective (MOBJ) optimization. A smoothness-based apparent (effective) complexity measure for RBF networks is considered, and, for this specific case, bounds on the complexity measure are formally derived. Experiments on synthetic and real-world data show that the proposed MOBJ learning method achieves efficient generalization control together with a reduction in network size.
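To illustrate the error-versus-complexity trade-off that MOBJ learning navigates, the sketch below trains a Gaussian RBF network on toy data and traces a trade-off curve by weighted-sum scalarization of two objectives: training error and a weight-norm complexity proxy. This is a minimal illustration only; the paper's actual smoothness-based complexity measure, its bounds, and its MOBJ algorithm are not reproduced here, and the Gaussian kernels, fixed centers, and ridge penalty are assumptions of this sketch.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix Phi[i, j] = exp(-||x_i - c_j||^2 / (2 w^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width, lam):
    """Ridge solution: minimizes ||Phi w - y||^2 + lam * ||w||^2."""
    Phi = rbf_design(X, centers, width)
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
    return w, Phi

# Toy 1-D regression data (hypothetical example, not from the paper)
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 60)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
centers = np.linspace(-3.0, 3.0, 10)[:, None]

# Sweeping the scalarization weight traces an (error, complexity) curve:
# small lam favors low training error; large lam favors low complexity.
front = []
for lam in [1e-4, 1e-2, 1.0, 10.0]:
    w, Phi = fit_rbf(X, y, centers, width=1.0, lam=lam)
    err = float(np.mean((Phi @ w - y) ** 2))   # objective 1: training error
    comp = float(w @ w)                        # objective 2: complexity proxy
    front.append((lam, err, comp))
```

Each point on `front` is a candidate trade-off between fit and smoothness; a genuine MOBJ method would instead produce (an approximation of) the whole Pareto set in one run and select a model from it by a decision criterion.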