The use of entropy as a cost function in the neural network learning phase usually implies that, in the back-propagation algorithm, training is done in batch mode. Apart from the higher complexity of the batch-mode algorithm, this approach is known to have some limitations compared with the sequential mode. In this paper we present a way of combining both modes when using entropic criteria. We present experiments that validate the proposed method, and we also compare it with the pure batch-mode algorithm.
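
The abstract does not spell out how the two modes are combined, so the following is only a minimal sketch of one plausible reading: within each epoch the training set is split into a few sub-batches, and a batch-mode error-entropy (EEM/MEE) update, based on a Gaussian Parzen estimate of the information potential, is applied to each sub-batch in sequence. Everything here (the names train_batch_sequential, eem_gradient, the parameters sigma and n_sub_batches, the single tanh unit, and the toy data) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def information_potential(e, sigma):
    """Parzen estimate of the information potential V(e); maximizing V
    is equivalent to minimizing Renyi's quadratic error entropy -log V."""
    s2 = 2.0 * sigma ** 2                                       # variance of the pairwise kernel
    diff = e[:, None] - e[None, :]
    return (np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)).mean()

def eem_gradient(w, b, X, t, sigma):
    """Gradient of V w.r.t. the weights of a single tanh unit on one sub-batch."""
    n = X.shape[0]
    y = np.tanh(X @ w + b)
    e = t - y
    s2 = 2.0 * sigma ** 2
    diff = e[:, None] - e[None, :]                              # e_i - e_j
    k = np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    kd = -diff / s2 * k                                         # derivative of the kernel
    de_dw = -(1.0 - y ** 2)[:, None] * X                        # d e_i / d w
    de_db = -(1.0 - y ** 2)                                     # d e_i / d b
    # dV/dw = (1/n^2) * sum_ij k'(e_i - e_j) * (de_i/dw - de_j/dw)
    gw = (kd[:, :, None] * (de_dw[:, None, :] - de_dw[None, :, :])).sum(axis=(0, 1)) / n ** 2
    gb = (kd * (de_db[:, None] - de_db[None, :])).sum() / n ** 2
    return gw, gb

def train_batch_sequential(X, t, sigma=0.8, lr=0.5, epochs=200, n_sub_batches=4, seed=0):
    """Each epoch: shuffle, split into sub-batches, and apply one batch-mode
    EEM update (gradient ascent on V) per sub-batch, in sequence."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(len(t)), n_sub_batches):
            gw, gb = eem_gradient(w, b, X[idx], t[idx], sigma)
            w += lr * gw                                        # ascent on V = descent on entropy
            b += lr * gb
    return w, b

if __name__ == "__main__":
    # Toy two-class problem with targets in {-1, +1}.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
    t = np.hstack([-np.ones(50), np.ones(50)])
    w, b = train_batch_sequential(X, t)
    e = t - np.tanh(X @ w + b)
    print("information potential:", information_potential(e, 0.8))
    print("training accuracy:", np.mean(np.sign(np.tanh(X @ w + b)) == t))
```

Under these assumptions, setting n_sub_batches=1 recovers pure batch-mode EEM training, while larger values move towards a sequential-style regime (note that the pairwise entropy estimate still needs at least two samples per sub-batch); intermediate values correspond to the mixed scheme the abstract refers to.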