Radial Basis Neural Networks (RBNN) can approximate any regular function and have a faster training phase than comparable neural networks. However, the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. As a consequence, the activation function is radially symmetric and all attributes are treated as equally relevant. This can be addressed by altering the metric used in the activation function (i.e., by using a metric that breaks this symmetry). The Mahalanobis distance is one such metric: it takes into account the variability of the attributes and their correlations. However, it is computed directly from the variance-covariance matrix and does not take the accuracy of the learning algorithm into account. In this paper, we propose a generalized Euclidean metric that follows the Mahalanobis structure but is evolved by a Genetic Algorithm (GA). The GA searches for the distance matrix that minimizes the error produced by a fixed RBNN. Our approach has been tested on two domains, and positive results have been observed in both cases.
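To make the idea concrete, below is a minimal sketch, not the authors' implementation, of an RBNN whose hidden units are activated through a generalized Euclidean distance d_M(x, c) = sqrt((x - c)^T M (x - c)), with M evolved by a simple GA to minimize the training error of the fixed network. The parameterization M = A^T A (which keeps M positive semidefinite), the Gaussian widths, the truncation selection, and all function names (rbf_activations, rbnn_error, evolve_metric) are illustrative assumptions; the abstract does not specify the paper's actual encoding or genetic operators.

```python
# Sketch: GA-evolved generalized Euclidean metric for a fixed RBNN.
# Assumption: M = A^T A so every candidate metric is positive semidefinite.
import numpy as np

rng = np.random.default_rng(0)

def rbf_activations(X, centers, A, width=1.0):
    """Hidden-layer outputs using the generalized metric M = A^T A."""
    diffs = X[:, None, :] - centers[None, :, :]   # shape (n, k, d)
    proj = diffs @ A.T                            # apply the metric factor
    d2 = np.sum(proj ** 2, axis=-1)               # squared generalized distances
    return np.exp(-d2 / (2.0 * width ** 2))

def rbnn_error(X, y, centers, A):
    """GA fitness: training MSE of the fixed RBNN under the candidate metric."""
    H = rbf_activations(X, centers, A)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)     # output weights by least squares
    return float(np.mean((H @ w - y) ** 2))

def evolve_metric(X, y, centers, pop_size=30, generations=100, sigma=0.1):
    """Evolve the factor A; the candidate with the lowest RBNN error wins."""
    d = X.shape[1]
    population = [np.eye(d) + sigma * rng.standard_normal((d, d))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda A: rbnn_error(X, y, centers, A))
        elite = population[:pop_size // 5]        # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            parent = elite[rng.integers(len(elite))]
            # Gaussian mutation of the metric factor
            children.append(parent + sigma * rng.standard_normal(parent.shape))
        population = elite + children
    return min(population, key=lambda A: rbnn_error(X, y, centers, A))

# Toy usage: random centers stand in for whatever center selection the RBNN uses.
X = rng.standard_normal((200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
centers = X[rng.choice(len(X), size=10, replace=False)]
A_best = evolve_metric(X, y, centers, generations=30)
print("identity metric:", rbnn_error(X, y, centers, np.eye(3)))
print("evolved metric: ", rbnn_error(X, y, centers, A_best))
```

The M = A^T A encoding is one convenient way to let the GA mutate freely while guaranteeing every individual remains a valid distance matrix; with A restricted to diagonal, the same sketch reduces to evolving per-attribute relevance weights.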