The paper investigates a modification of the backpropagation algorithm in which neural network weights are discretized after each training cycle. The modification, aimed at reducing overfitting, restricts the possible weight values to a discrete subset of the real numbers, which yields much better generalization: in extreme cases (when overfitting is severe) the error rate drops by over 50%. Discretization is performed nondeterministically, so that the expected value of a discretized weight equals its original value; in this way the global behavior of the original algorithm is preserved. The presented discretization method is general and may be applied to other machine-learning algorithms. It is also an example of how an algorithm for continuous optimization can be successfully applied to optimization over discrete spaces. The method was evaluated experimentally in the WEKA environment using two real-world data sets from the UCI repository.
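The unbiased, nondeterministic discretization described above can be illustrated with stochastic rounding: each weight is snapped to one of the two neighboring points on a fixed grid, with the round-up probability chosen so that the expected result equals the original weight. The following is a minimal sketch, not the paper's actual implementation; the grid step `step` and the function name are assumptions for illustration.

```python
import numpy as np

def stochastic_discretize(weights, step=0.25, rng=None):
    """Round each weight to the grid {k * step} nondeterministically,
    choosing the round-up probability so that E[w'] = w (unbiased)."""
    rng = np.random.default_rng() if rng is None else rng
    lower = np.floor(weights / step) * step
    # Probability of rounding up equals the fractional position
    # of the weight within its grid cell, which makes E[w'] = w.
    p_up = (weights - lower) / step
    return lower + step * (rng.random(weights.shape) < p_up)

# Averaging many independent roundings recovers the original weights,
# which is why the global behavior of training is preserved in expectation.
rng = np.random.default_rng(0)
w = np.array([0.1, -0.37, 0.52])
samples = np.stack([stochastic_discretize(w, 0.25, rng) for _ in range(20000)])
print(samples.mean(axis=0))  # close to [0.1, -0.37, 0.52]
```

In a training loop, such a step would be applied to all weights after each cycle, so the network only ever holds grid values while the gradient updates still steer it as in ordinary backpropagation on average.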