This paper presents a meta-cognitive learning algorithm for a single-hidden-layer complex-valued neural network called the "Meta-cognitive Fully Complex-valued Relaxation Network (McFCRN)". McFCRN has two components: a cognitive component and a meta-cognitive component. A Fully Complex-valued Relaxation Network (FCRN), with a fully complex-valued Gaussian-like activation function (sech) in the hidden layer and an exponential activation function in the output layer, forms the cognitive component. The meta-cognitive component contains a self-regulatory learning mechanism that controls the learning of the FCRN by deciding what to learn, when to learn, and how to learn from a sequence of training data. The input parameters of the cognitive component are chosen randomly, and the output parameters are estimated by minimizing a logarithmic error function. The problem of explicitly minimizing the magnitude and phase errors in the logarithmic error function is converted into a system of linear equations, so the output parameters of the FCRN are computed analytically. McFCRN starts with zero hidden neurons and adds neurons until it can approximate the target function. The meta-cognitive component selects the best learning strategy for the FCRN to acquire knowledge from the training data and adapts the learning strategies to emulate the best aspects of human learning. Performance studies on a function approximation problem and on real-valued classification problems show that the proposed McFCRN performs better than existing results reported in the literature.
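The key analytic step described above — random input parameters, an exponential output activation, and output weights obtained by solving a linear system in the log domain — can be illustrated with a minimal sketch. This is not the authors' implementation: the data are synthetic, the number of hidden neurons is fixed rather than grown incrementally, and the meta-cognitive self-regulation (what/when/how to learn) is omitted. Because the output is y = exp(H·β), minimizing the logarithmic error log(y) − log(t) is linear in β, so β follows from an ordinary least-squares solve.

```python
import numpy as np

def csech(z):
    # Fully complex hyperbolic secant, the Gaussian-like hidden activation.
    return 1.0 / np.cosh(z)

rng = np.random.default_rng(0)

# Toy complex-valued data (hypothetical): N samples, m inputs, 1 output.
N, m, K = 200, 3, 15          # K hidden neurons, fixed here for simplicity
X = rng.normal(size=(N, m)) + 1j * rng.normal(size=(N, m))
T = np.exp(0.3 * X[:, :1] - 0.1j * X[:, 1:2])   # synthetic target

# Input-side parameters are chosen randomly, as in the paper.
W_in = rng.normal(scale=0.2, size=(m, K)) + 1j * rng.normal(scale=0.2, size=(m, K))
b = rng.normal(scale=0.2, size=K) + 1j * rng.normal(scale=0.2, size=K)

H = csech(X @ W_in + b)       # hidden-layer responses, shape (N, K)

# Output layer: Y = exp(H @ beta). In the log domain the problem is linear,
# so beta is computed analytically via complex least squares.
beta, *_ = np.linalg.lstsq(H, np.log(T), rcond=None)
Y = np.exp(H @ beta)
```

A log-domain loss of this form penalizes magnitude and phase errors jointly, since log(y) − log(t) = log|y/t| + i·arg(y/t), which is the motivation for converting the explicit magnitude/phase minimization into one linear solve.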