This paper presents a model of a complex-valued neuron (CVN) for real-valued classification problems, introducing two new activation functions. In this CVN model, each real-valued input is encoded as a phase between 0 and π of a complex number of unit magnitude and multiplied by a complex-valued weight. The weighted sum of inputs is then fed to an activation function. Both proposed activation functions map complex values to real values; their role is to divide the net-input (weighted-sum) space into multiple regions representing the classes of input patterns. Gradient-based learning rules are derived for each activation function. The capability of such a CVN is discussed and tested on two-class problems, such as two- and three-input Boolean problems and symmetry detection in binary sequences. We show that the CVN with either activation function can form proper decision boundaries for these linear and nonlinear problems. For solving n-class problems, a complex-valued neural network (CVNN) consisting of n CVNs is also studied, where the neuron exhibiting the largest output among all neurons determines the output class. We tested such single-layered CVNNs on several real-world benchmark problems. The results show that the classification ability of the single-layered CVNN on unseen data is comparable to that of a conventional real-valued neural network (RVNN) with one hidden layer. Moreover, the CVNN converges much faster than the RVNN in most cases.
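The forward pass described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the phase encoding follows the abstract (each real input mapped to a phase in [0, π] on the unit circle), but the abstract does not specify the two activation functions, so the squared magnitude |z|² is used here as a hypothetical stand-in for a complex-to-real activation, and all function names are the author's own.

```python
import numpy as np

def encode_phase(x, x_min, x_max):
    """Encode real inputs as unit-magnitude complex numbers.

    Each value is mapped linearly to a phase theta in [0, pi] and
    represented as exp(i * theta), as described in the abstract.
    """
    theta = np.pi * (x - x_min) / (x_max - x_min)
    return np.exp(1j * theta)

def cvn_forward(x_encoded, weights, bias):
    """Forward pass of a single complex-valued neuron (CVN).

    The net input z (weighted sum plus bias) is complex; an
    activation then maps it to a real value. NOTE: |z|^2 is a
    hypothetical stand-in, since the paper's two activation
    functions are not given in the abstract.
    """
    z = np.dot(weights, x_encoded) + bias
    return np.abs(z) ** 2

def cvnn_predict(x_encoded, weight_matrix, biases):
    """Single-layer CVNN for n-class problems: one CVN per class.

    The neuron with the largest real-valued output determines the
    predicted class (winner-take-all, as in the abstract).
    """
    outputs = np.abs(weight_matrix @ x_encoded + biases) ** 2
    return int(np.argmax(outputs))
```

Note that the encoding places the extremes of the input range at opposite ends of the upper unit semicircle: the minimum maps to phase 0 (the complex number 1) and the maximum to phase π (the complex number -1), so distinct real inputs remain distinguishable after encoding.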