Complex backpropagation neural network using elementary transcendental activation functions
ICASSP '01: Proceedings of the 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 2
Designing a neural network (NN) to process complex-valued signals is a challenging task, since a complex nonlinear activation function (AF) cannot be both analytic and bounded everywhere in the complex plane ℂ. The traditional way to avoid this difficulty has been 'splitting', i.e., applying a pair of real sigmoidal functions separately to the real and imaginary components. However, this ad hoc compromise with the unbounded nature of nonlinear complex functions yields a nowhere-analytic AF, so the error back-propagation (BP) must rely on the split derivatives of the real and imaginary components instead of well-defined, fully complex derivatives. In this paper, a fully complex multi-layer perceptron (MLP) structure that yields a simplified complex-valued BP algorithm is presented. The simplified BP shows that the fully complex BP weight update formula is the complex conjugate form of the real BP formula, and that split complex BP is a special case of fully complex BP. This generalization is made possible by employing elementary transcendental functions (ETFs) that are almost everywhere (a.e.) bounded and analytic in ℂ. The properties of the fully complex MLP are investigated, and the advantage of ETFs over split complex AFs is demonstrated in numerical examples where nonlinear magnitude and phase distortions of non-constant-modulus modulated signals are successfully restored.
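The conjugate-form update described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's code: the single-neuron model, toy data, step size, and iteration count are illustrative assumptions. It trains one fully complex neuron y = tanh(w·x + b), where tanh is an ETF (a.e. bounded and analytic in ℂ), using the fully complex BP rule w ← w + μ·e·conj(f′(net))·conj(x), i.e., the complex conjugate form of the familiar real BP update w ← w + μ·e·f′(net)·x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: complex-valued inputs and a smooth complex target mapping
# generated by an ETF-type function (illustrative assumption).
x = (rng.standard_normal(200) + 1j * rng.standard_normal(200)) * 0.5
d = np.tanh(0.8 * x + 0.2j)

# Single fully complex neuron: y = tanh(w*x + b).
w, b = 0.1 + 0.1j, 0.0 + 0.0j
mu = 0.1  # learning rate (assumed value)

def loss(w, b):
    return np.mean(np.abs(d - np.tanh(w * x + b)) ** 2)

loss0 = loss(w, b)
for _ in range(500):
    net = w * x + b
    y = np.tanh(net)
    e = d - y                 # complex error
    fprime = 1.0 - y ** 2     # fully complex derivative tanh'(net)
    # Fully complex BP: conjugate form of the real BP update,
    #   w <- w + mu * e * conj(f'(net)) * conj(x)
    w += mu * np.mean(e * np.conj(fprime) * np.conj(x))
    b += mu * np.mean(e * np.conj(fprime))
loss1 = loss(w, b)
```

Because tanh is analytic wherever it is finite, a single conjugate-weighted term propagates the full complex error; a split AF would instead require separate real and imaginary derivative paths.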