This paper presents the adaptation of a single-layer complex-valued neural network (NN) to use error entropy in the cost function instead of the usual mean squared error (MSE). This network has the desirable property of having only one layer, so there is no need to search for the number of hidden-layer neurons: the topology is completely determined by the problem. We first extend the existing stochastic MSE-based learning algorithm to a batch MSE version, and then to a batch minimum error entropy (MEE) version. We present experiments showing that the proposed algorithms are competitive with other learning machines.
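The MEE criterion is commonly implemented by minimizing Rényi's quadratic entropy of the errors, estimated with a Gaussian Parzen window; minimizing the entropy is equivalent to maximizing the so-called information potential of the pairwise error differences. The sketch below illustrates this cost for a batch of (possibly complex-valued) errors; the function names, the kernel bandwidth `sigma`, and the use of the squared magnitude for complex differences are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    # Parzen-window estimate of the quadratic information potential
    # V(e) = (1/N^2) * sum_ij G_{sigma*sqrt(2)}(e_i - e_j).
    # Minimizing Renyi's quadratic entropy H2 = -log V(e) is
    # equivalent to maximizing V(e).
    e = np.asarray(errors)
    diff = e[:, None] - e[None, :]            # all pairwise error differences
    sq = np.abs(diff) ** 2                    # squared magnitude (handles complex errors)
    kernel = np.exp(-sq / (4.0 * sigma ** 2)) # Gaussian kernel with bandwidth sigma*sqrt(2)
    return kernel.mean()

def mee_cost(errors, sigma=1.0):
    # Batch MEE cost: Renyi's quadratic entropy estimate of the errors.
    return -np.log(information_potential(errors, sigma))
```

A batch MEE training step would evaluate `mee_cost` on the full error vector of the epoch and descend its gradient with respect to the network weights, in place of the batch MSE gradient. Note that the cost depends only on error differences, so it is blind to the error mean; in practice the output bias is adjusted separately (e.g. to center the errors).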