Reducing the target levels of training samples for a machine classifier, guided by the outputs of an auxiliary classifier, is attractive because it saves expressive power that would otherwise be spent unnecessarily on raising the output level of already well-classified samples. In this paper we propose an iterative form of this selective reduction of target levels with a simple linear reduction schedule. Extensive simulations show that the proposed method performs at least as well as conventional training and as static versions of the reduction, and also compares favorably with support vector machines (SVMs). This potential advantage comes with a smaller classifier size and a design effort not much higher than that of the corresponding SVM, making the proposed method very attractive for practical applications.
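The idea described above can be sketched in a minimal form. The following is a hypothetical illustration, not the authors' exact algorithm: a linear model is trained on toy two-class data, and at each outer iteration the targets of samples already classified with a comfortable margin are shrunk toward the decision threshold by a factor that follows a linear schedule. The margin threshold, schedule length, and final reduction factor are all assumptions chosen for the example.

```python
import numpy as np

# Hypothetical sketch of iterative target-level reduction with a linear
# schedule. Well-classified samples get their targets lowered so the model
# does not spend capacity pushing easy outputs ever higher.

rng = np.random.default_rng(0)

# Toy 2-class data: two Gaussian blobs, labels in {-1, +1}.
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)),
               rng.normal(+1.0, 1.0, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])

w = np.zeros(2)
b = 0.0
lr = 0.1
T = 20            # outer iterations of the reduction schedule (assumed)
r_final = 0.3     # final reduction factor for well-classified targets (assumed)

for t in range(T):
    # Linear schedule: reduction factor goes from 1.0 down to r_final.
    r = 1.0 - (1.0 - r_final) * t / (T - 1)
    out = X @ w + b              # here the model itself plays the auxiliary role
    targets = y.copy()
    well = out * y > 1.0         # samples classified with a comfortable margin
    targets[well] = r * y[well]  # lower their target level
    # One least-squares gradient step toward the (possibly reduced) targets.
    grad = out - targets
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

acc = np.mean(np.sign(X @ w + b) == y)
```

Note that when `r == 1.0` (the first iteration) this reduces to conventional training; the static variants mentioned in the abstract would correspond to fixing `r` at a single value for all iterations instead of annealing it.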