Sum and product kernel regularization networks
ICAISC'06: Proceedings of the 8th International Conference on Artificial Intelligence and Soft Computing
Regularization networks are an important supervised learning method applicable to regression and classification tasks. They rest on a solid theoretical foundation, but the presence of meta-parameters is their drawback. These meta-parameters, including the type of kernel function, are typically assumed to be given in advance and supplied as an input to the algorithm. In this paper, we propose multi-kernel functions, namely product kernel functions and composite kernel functions. The choice of kernel function thus becomes part of the optimization process, for which we introduce a new evolutionary learning algorithm that handles different kernel functions, including composite kernels. The approach is demonstrated in experiments on benchmark tasks.
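The multi-kernel idea rests on a standard closure property: both the sum and the product of two positive-definite kernels are again positive-definite kernels. A minimal sketch of such composite kernels, using Gaussian base kernels (the function names and widths here are illustrative, not the paper's implementation):

```python
import numpy as np

def gaussian_kernel(width):
    """Return a Gaussian (RBF) kernel function with the given width."""
    def k(x, y):
        diff = np.asarray(x) - np.asarray(y)
        return np.exp(-np.dot(diff, diff) / (width ** 2))
    return k

def sum_kernel(k1, k2):
    """Composite kernel: the sum of two valid kernels is a valid kernel."""
    return lambda x, y: k1(x, y) + k2(x, y)

def product_kernel(k1, k2):
    """Composite kernel: the product of two valid kernels is a valid kernel."""
    return lambda x, y: k1(x, y) * k2(x, y)

# Example: combine a narrow and a wide Gaussian kernel.
k_narrow = gaussian_kernel(0.5)
k_wide = gaussian_kernel(2.0)
k_sum = sum_kernel(k_narrow, k_wide)
k_prod = product_kernel(k_narrow, k_wide)

x, y = [0.0, 0.0], [1.0, 1.0]
print(k_sum(x, y), k_prod(x, y))
```

In an evolutionary setting of the kind the paper describes, the kernel widths and the choice of combination (sum vs. product) would be encoded in the individuals and tuned by the search rather than fixed in advance as above.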