Complex-valued neural networks (CVNNs) involve nonholomorphic functions in two ways: (i) through their loss functions and (ii) through the widely used activation functions. Derivatives of such functions are handled by Wirtinger calculus. In this paper, we derive two popular algorithms, gradient descent and the Levenberg-Marquardt (LM) algorithm, for parameter optimization in feedforward CVNNs using Wirtinger calculus, which is simpler than the conventional derivation that treats the problem in the real domain. In deriving the LM algorithm, we solve and use the result of the complex-domain least squares problem $\min_{\mathbf{z}}\|\mathbf{b}-(\mathbf{A}\mathbf{z}+\mathbf{B}\mathbf{z}^*)\|$, which is more general than $\min_{\mathbf{z}}\|\mathbf{b}-\mathbf{A}\mathbf{z}\|$. Computer simulation results show that, as in the real-valued case, the complex-LM algorithm learns much faster and with higher accuracy than the complex gradient descent algorithm.
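To illustrate the Wirtinger-calculus gradient descent idea on the smallest possible case (not the paper's network, just a single complex weight with a hypothetical training sample), consider the real-valued, nonholomorphic loss $f(w) = |wx - d|^2$. The steepest-descent direction is the conjugate Wirtinger derivative $\partial f/\partial w^* = (wx - d)\,x^*$, giving the update $w \leftarrow w - \mu\,\partial f/\partial w^*$:

```python
import numpy as np

# Minimal sketch, assuming a single complex weight w and one sample (x, d).
# f(w) = |w*x - d|^2 is real-valued and nonholomorphic in w; its conjugate
# Wirtinger derivative df/dw* = (w*x - d) * conj(x) is the steepest-descent
# direction, so the update is w <- w - mu * df/dw*.

x, d = 1.0 + 2.0j, 3.0 - 1.0j   # hypothetical training sample
w = 0.0 + 0.0j                   # initial weight
mu = 0.05                        # step size

for _ in range(200):
    e = w * x - d                # complex prediction error
    w = w - mu * e * np.conj(x)  # Wirtinger gradient step

print(abs(w * x - d))            # residual, driven toward 0
```

Each step scales the error by $(1 - \mu|x|^2)$, so for $\mu|x|^2 < 1$ (here $0.25 < 1$) the iteration converges geometrically, which is the scalar analogue of the step-size condition in the complex gradient descent algorithm.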
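The generalized least squares problem $\min_{\mathbf{z}}\|\mathbf{b}-(\mathbf{A}\mathbf{z}+\mathbf{B}\mathbf{z}^*)\|$ can be sketched numerically as follows. This is not the paper's derivation, just one standard way to solve it: because $\mathbf{z} \mapsto \mathbf{A}\mathbf{z}+\mathbf{B}\mathbf{z}^*$ is not complex-linear (it involves $\mathbf{z}^*$), we split $\mathbf{z} = \mathbf{x} + i\mathbf{y}$ and solve an equivalent real least squares problem; the matrices and right-hand side below are random illustrative data:

```python
import numpy as np

# Sketch: solve min_z ||b - (A z + B z*)|| by real decomposition.
# With z = x + i y:  A z + B z* = (A + B) x + i (A - B) y = C x + E y,
# where C = A + B and E = i (A - B) act on the real vectors x, y.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = A @ z_true + B @ np.conj(z_true)   # consistent right-hand side

C = A + B
E = 1j * (A - B)
# Stack real and imaginary parts into one real least-squares system.
M = np.block([[C.real, E.real],
              [C.imag, E.imag]])
rhs = np.concatenate([b.real, b.imag])
sol, *_ = np.linalg.lstsq(M, rhs, rcond=None)
z = sol[:n] + 1j * sol[n:]

print(np.linalg.norm(A @ z + B @ np.conj(z) - b))  # near zero
```

For $\mathbf{B} = \mathbf{0}$ this reduces to the ordinary complex least squares problem $\min_{\mathbf{z}}\|\mathbf{b}-\mathbf{A}\mathbf{z}\|$, which is why the augmented form is the more general building block for the complex-LM step.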