Neural Networks for Pattern Recognition
Combining Pattern Classifiers: Methods and Algorithms
Boosting and other ensemble methods
Neural Computation
Stacked generalization: when does it work?
IJCAI'97 Proceedings of the Fifteenth International Joint Conference on Artificial Intelligence - Volume 2
Issues in stacked generalization
Journal of Artificial Intelligence Research
An experimental study on training radial basis functions by gradient descent
ANNPR'06 Proceedings of the Second international conference on Artificial Neural Networks in Pattern Recognition
Combining MF networks: a comparison among statistical methods and stacked generalization
ANNPR'06 Proceedings of the Second international conference on Artificial Neural Networks in Pattern Recognition
Reformulated radial basis neural networks trained by gradient descent
IEEE Transactions on Neural Networks
On the construction and training of reformulated radial basis function neural networks
IEEE Transactions on Neural Networks
Radial Basis Function (RBF) networks are successfully applied to classification problems and can be trained by gradient descent algorithms, so the performance of a single RBF network can be improved by combining several networks into an ensemble. The literature shows that the performance of ensembles of Multilayer Feedforward (MF) networks can be improved by the two combination methods based on Stacked Generalization described in [1], which suggests that applying these combiners to an RBF ensemble could yield a better classification system. In this paper we successfully apply these two methods, Stacked and Stacked+, to ensembles of RBF networks. We also propose, with good results, increasing the number of networks used in the combination module. The results show that training three MF networks to combine an RBF ensemble is the best alternative.
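The scheme described above can be sketched as a two-level stacked generalization pipeline. This is a minimal illustration, not the paper's implementation: scikit-learn provides no RBF network estimator, so `MLPClassifier` stands in for both the level-0 RBF networks and the MF combiner, and a single combiner is trained rather than the three proposed in the paper. All dataset and parameter choices here are assumptions for the sketch.

```python
# Sketch of stacked generalization: level-0 base networks (RBF networks in
# the paper, approximated here by MLPClassifier) feed a level-1 combiner
# network trained on their cross-validated outputs.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic two-class problem, purely for illustration.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Level 0: an ensemble of three base networks with different initializations.
base_nets = [
    (f"net{i}", MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000,
                              random_state=i))
    for i in range(3)
]

# Level 1: one MF-style combiner network; its inputs are the level-0
# predictions produced via internal 3-fold cross-validation (cv=3).
stack = StackingClassifier(
    estimators=base_nets,
    final_estimator=MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000,
                                  random_state=0),
    cv=3,
)
stack.fit(X, y)
print(stack.score(X, y))  # training accuracy of the stacked system
```

Extending this toward the paper's proposal would mean training three such combiner networks and averaging their outputs, which `StackingClassifier` does not do directly; that step would need a small custom wrapper.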