Neurocomputing
In this paper we present the Border Pairs Method, a constructive learning algorithm for the multilayer perceptron (MLP). During learning, the method finds a near-minimal network architecture. The MLP is trained layer by layer and neuron by neuron. The algorithm is tested in computer simulation on simple learning patterns (XOR and a triangles image), on traditional benchmark datasets (Iris and MNIST), and on noisy learning patterns. Learning with this method is less likely to get stuck in local minima, and it generalizes well. Learning from noisy, multi-dimensional, and numerous patterns also works well. The Border Pairs Method additionally supports incremental learning.
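The core idea can be illustrated with a minimal sketch. Here a "border pair" is approximated as a pattern together with its nearest pattern of the opposite class; such pairs mark where a separating hyperplane (i.e., a first-layer neuron) is needed. This is an assumption-laden simplification for illustration only: the pairing criterion in the actual Border Pairs Method may differ (e.g., mutual-nearest conditions), and the function name `border_pairs` is hypothetical.

```python
import numpy as np

def border_pairs(X, y):
    """Hedged sketch: pair every pattern with its nearest pattern of the
    opposite class. Each such pair straddles the class boundary and marks
    a spot where the constructive algorithm would place a neuron."""
    pairs = set()
    for i in range(len(X)):
        # indices of all patterns belonging to the opposite class
        opposite = [j for j in range(len(X)) if y[j] != y[i]]
        # nearest opposite-class pattern by Euclidean distance
        j = min(opposite, key=lambda j: np.linalg.norm(X[i] - X[j]))
        pairs.add(tuple(sorted((i, j))))
    return sorted(pairs)

# XOR, one of the simple test patterns mentioned in the abstract
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
print(border_pairs(X, y))  # → [(0, 1), (0, 2), (1, 3)]
```

For XOR every pattern sits at unit distance from two opposite-class patterns, so several border pairs emerge; the constructive step would then fit a hyperplane between the members of each pair, which is how a near-minimal first layer can be derived directly from the data rather than chosen in advance.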