This paper proposes two new training algorithms for multilayer perceptrons based on evolutionary computation, regularization, and transduction. Regularization is a widely used technique for preventing a learning algorithm from overfitting the training data. In this context, the work introduces and analyzes a novel regularization scheme for neural networks (NNs), named eigenvalue decay, which aims at improving the classification margin. Eigenvalue decay leads to a new training method built on the same principles as the SVM, and hence named Support Vector NN (SVNN). Finally, by analogy with the transductive SVM (TSVM), a transductive NN (TNN) is proposed, which exploits the SVNN to address transductive learning. The effectiveness of the proposed algorithms is evaluated on seven benchmark datasets.
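As a rough illustration of the idea behind eigenvalue decay, the sketch below adds a penalty on the dominant eigenvalue of W Wᵀ for a hidden-layer weight matrix W. The function name, the penalty coefficient, and the choice of penalizing only the largest eigenvalue are assumptions made here for illustration, not a definitive statement of the paper's exact formulation.

```python
import numpy as np

def eigenvalue_decay_penalty(W, coef=1e-3):
    """Hypothetical eigenvalue-decay regularization term.

    Penalizes the dominant eigenvalue of W @ W.T, where W is a
    hidden-layer weight matrix of shape (n_hidden, n_inputs).
    `coef` is an assumed regularization strength.
    """
    # W @ W.T is symmetric positive semi-definite, so its eigenvalues
    # are real and non-negative; eigvalsh returns them in ascending order.
    eigenvalues = np.linalg.eigvalsh(W @ W.T)
    return coef * eigenvalues[-1]

# Usage: add the penalty to the ordinary training loss of the network,
# e.g. total_loss = data_loss + eigenvalue_decay_penalty(W).
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 10))
penalty = eigenvalue_decay_penalty(W)
```

In a gradient-based trainer the penalty would be differentiated along with the data loss; here it is shown only as a standalone computation.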