Artificial neural networks (ANNs) are often trained using gradient-descent algorithms such as backpropagation. An important problem in the learning process is the slowdown incurred by temporary minima (TM). We analyze this problem for an ANN trained to solve the Exclusive-Or (XOR) problem. After each learning step, the network is transformed into the equivalent all-permutations fuzzy rule-base (FARB), which provides a symbolic representation of the knowledge embedded in the network. We develop a mathematical model for the evolution of the fuzzy rule-base parameters during learning in the vicinity of a TM. We show that the rule-base becomes singular and tends to remain singular in the vicinity of a TM. The analysis of the fuzzy rule-base suggests a simple remedy for the learning slowdown incurred by TM: slightly perturb the desired output values in the training examples so that they are no longer symmetric. Simulations demonstrate the effectiveness of this approach in reducing the time spent in the vicinity of TM.
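The remedy described above can be sketched in a few lines of code. The following is a minimal illustration, not the paper's actual experimental setup: it trains a 2-2-1 sigmoid network on XOR by plain batch backpropagation, with targets slightly and asymmetrically perturbed away from the symmetric {0, 1} values. The network architecture, learning rate, epoch count, and the specific perturbation magnitudes (0.02, 0.97, 0.99, 0.03) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(targets, epochs=20000, lr=0.5, seed=0):
    """Batch backpropagation on XOR with a 2-2-1 sigmoid network."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.asarray(targets, dtype=float).reshape(-1, 1)
    # Small random initial weights (illustrative initialization).
    W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)        # hidden-layer activations
        out = sigmoid(h @ W2 + b2)      # network output
        err = out - y                   # gradient of squared error w.r.t. output
        d2 = err * out * (1 - out)      # output-layer delta
        d1 = (d2 @ W2.T) * h * (1 - h)  # hidden-layer delta (backpropagated)
        W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
        W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
    return out

# Symmetric targets {0, 1} can leave training stuck near a temporary
# minimum; slightly asymmetric targets break that symmetry.
symmetric = [0.0, 1.0, 1.0, 0.0]
perturbed = [0.02, 0.97, 0.99, 0.03]  # illustrative asymmetric values
out = train_xor(perturbed)
print(np.round(out.ravel(), 2))
```

The key point is only the choice of `perturbed` over `symmetric`: the perturbations are small enough that thresholding the outputs at 0.5 still yields the XOR truth table, but they remove the symmetry that, per the analysis, keeps the equivalent fuzzy rule-base singular near a TM.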