We investigate the approximation ability of a multilayer perceptron (MLP) network when it is extended to the complex domain. The main challenge in processing complex data with neural networks has been the lack of a bounded and analytic nonlinear activation function in the complex domain: by Liouville's theorem, a function that is both bounded and analytic on the entire complex plane must be constant. To avoid this conflict between boundedness and analyticity, ad hoc architectures have traditionally been employed, such as a pair of real-valued MLPs in which one network processes the real part and the other the imaginary part of the data. However, because nonanalytic functions do not satisfy the Cauchy-Riemann conditions, these approaches lead to degenerate backpropagation algorithms that compromise the efficiency of nonlinear approximation and learning in the complex vector field. A number of analytic elementary transcendental functions (ETFs), derivable from the entire exponential function e^z, have been defined as fully complex activation functions; they provide a parsimonious structure for processing data in the complex domain and address most of the shortcomings of the traditional approach. The introduction of ETFs, however, raises a new question about the approximation capability of the resulting fully complex MLP. In this letter, three proofs of the approximation capability of the fully complex MLP are provided, organized by the singularity characteristics of the ETFs. First, fully complex MLPs with ETFs that are continuous on a compact subset of the complex vector field are shown to be universal approximators of any continuous complex mapping. Second, the complex universal approximation theorem is extended to bounded measurable ETFs possessing a removable singularity. Finally, it is shown that the output of complex MLPs using ETFs with isolated and essential singularities converges uniformly to any nonlinear mapping in the deleted annulus of singularity nearest to the origin.
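To make the architecture under discussion concrete, the sketch below implements the forward pass of a single-hidden-layer fully complex MLP in NumPy, applying an ETF (here the complex hyperbolic tangent, which is analytic except at isolated poles) directly to complex pre-activations. The layer sizes, initialization, and function names are illustrative assumptions, not details taken from the letter.

```python
import numpy as np

def fully_complex_mlp(z, W1, b1, W2, b2, activation=np.tanh):
    """Forward pass of a single-hidden-layer fully complex MLP.

    z              : (n_in,) complex input vector
    W1, b1, W2, b2 : complex-valued weights and biases
    activation     : an ETF applied directly to the complex pre-activation,
                     e.g. tanh(z), analytic except for isolated poles at
                     z = i*(pi/2 + k*pi).
    """
    h = activation(W1 @ z + b1)   # complex hidden layer with ETF activation
    return W2 @ h + b2            # linear complex output layer

# Illustrative usage with random complex weights (sizes are assumptions).
rng = np.random.default_rng(0)

def crandn(*shape):
    # Circular complex Gaussian initializer.
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

n_in, n_hidden, n_out = 4, 8, 2
W1, b1 = crandn(n_hidden, n_in), crandn(n_hidden)
W2, b2 = crandn(n_out, n_hidden), crandn(n_out)
z = crandn(n_in)
print(fully_complex_mlp(z, W1, b1, W2, b2))
```

Because the ETF is analytic wherever it is defined, the same complex weights serve both the real and imaginary parts of the signal, in contrast to the split real/imaginary two-MLP approach described above.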