The nature of statistical learning theory
Machine learning, neural and statistical classification
Self-Organizing Maps
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Pattern Recognition and Machine Learning (Information Science and Statistics)
Dynamics and Generalization Ability of LVQ Algorithms
The Journal of Machine Learning Research
A Comparison of Methods for Learning of Highly Non-separable Problems
ICAISC '08 Proceedings of the 9th international conference on Artificial Intelligence and Soft Computing
Projection Pursuit Constructive Neural Networks Based on Quality of Projected Clusters
ICANN '08 Proceedings of the 18th international conference on Artificial Neural Networks, Part II
Learning highly non-separable Boolean functions using constructive feedforward neural network
ICANN'07 Proceedings of the 17th international conference on Artificial neural networks
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part I
Fast projection pursuit based on quality of projected clusters
ICANNGA'11 Proceedings of the 10th international conference on Adaptive and natural computing algorithms - Volume Part II
Neural networks and other sophisticated machine learning algorithms frequently miss simple solutions that can be discovered by more constrained learning methods. The transition from a single neuron solving linearly separable problems, to a multithreshold neuron solving k-separable problems, to neurons implementing prototypes solving q-separable problems, is investigated. Using the Learning Vector Quantization (LVQ) approach, this transition is presented as going from two prototypes defining a single hyperplane, to many co-linear prototypes defining parallel hyperplanes, to unconstrained prototypes defining a Voronoi tessellation. For most datasets, relaxing the co-linearity condition improves accuracy at the cost of increased model complexity, but for data with inherent logical structure, LVQ algorithms with constraints significantly outperform the original LVQ and many other algorithms.
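To make the prototype-based setting concrete, here is a minimal sketch of plain (unconstrained) LVQ1, the baseline the abstract refers to: each training sample attracts its nearest prototype when the class labels match and repels it otherwise, so the final prototypes define a Voronoi tessellation of the input space. This is a generic illustration, not the constrained (co-linear) variant studied in the paper; the function names and the fixed learning-rate schedule are my own simplifications.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """Minimal LVQ1 sketch: for each sample, move the nearest prototype
    toward the sample if classes match, away from it otherwise."""
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            # nearest prototype by Euclidean distance
            d = np.linalg.norm(P - x, axis=1)
            j = np.argmin(d)
            if proto_labels[j] == label:
                P[j] += lr * (x - P[j])   # attract (same class)
            else:
                P[j] -= lr * (x - P[j])   # repel (different class)
    return P

def lvq_predict(X, P, proto_labels):
    """Classify each sample by the label of its nearest prototype."""
    d = np.linalg.norm(P[None, :, :] - X[:, None, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]
```

The constrained variants discussed above would restrict how the prototypes may move (e.g. forcing them to stay co-linear so that they define parallel hyperplanes), trading flexibility for a strong bias that pays off on data with logical structure.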