An important problem in learning with gradient-descent algorithms (such as backpropagation) is the slowdown incurred by temporary minima (TM). We consider this problem for an artificial neural network trained to solve the XOR problem. The network is transformed into the equivalent all-permutations fuzzy rule-base, which provides a symbolic representation of the knowledge embedded in the network. We develop a mathematical model for the evolution of the fuzzy rule-base parameters during learning in the vicinity of TM. We show that the rule-base becomes singular and tends to remain singular in the vicinity of TM. Our analysis suggests a simple remedy for overcoming the slowdown in learning incurred by TM: slightly perturbing the values of the training examples so that they are no longer symmetric. Simulations demonstrate the usefulness of this approach.
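To make the proposed remedy concrete, the following is a minimal sketch (not the authors' code) of training a 2-2-1 sigmoid network on XOR by plain backpropagation, with the targets given small, distinct perturbations so the training set is no longer symmetric. The perturbation size eps, the architecture, the learning rate, and the number of steps are illustrative assumptions.

    # Minimal sketch of the symmetry-breaking remedy, assuming a 2-2-1
    # sigmoid network trained by plain gradient descent (backpropagation).
    import numpy as np

    rng = np.random.default_rng(0)

    # Standard XOR training set.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Break the symmetry of the targets with small, distinct perturbations
    # (eps is an illustrative choice, not a value from the paper).
    eps = 0.05
    y_perturbed = y + eps * np.array([[1], [-1], [2], [-2]])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Randomly initialized 2-2-1 network.
    W1 = rng.normal(scale=0.5, size=(2, 2))
    b1 = np.zeros(2)
    W2 = rng.normal(scale=0.5, size=(2, 1))
    b2 = np.zeros(1)

    lr = 0.5
    for step in range(20000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backward pass for squared error on the perturbed targets.
        err = out - y_perturbed
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    # Outputs should approach the (unperturbed) XOR targets 0, 1, 1, 0.
    print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 3))

Because the perturbations are small, the learned network still solves the original XOR problem; their only role is to remove the symmetry that, per the analysis, keeps the equivalent fuzzy rule-base singular near a temporary minimum.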