When training machine classifiers, replacing hard classification targets with emphasized soft versions of them helps to reduce the negative effects of using standard cost functions as approximations to misclassification rates. This emphasis has the same kind of effect as sample editing methods, which have proved effective for improving classifier performance. In this paper, we explore the effectiveness of using emphasized soft targets with generative models, such as Gaussian Mixture Models (GMMs) and Gaussian Processes (GPs). The interest of using GMMs is that they offer advantages such as easy interpretation and straightforward ways to deal with missing values. With respect to GPs, if we use soft targets, we do not need to resort to any complex approximation to get a Gaussian Process classifier and, simultaneously, we can obtain the advantages provided by the use of an emphasis. Simulation results support the usefulness of the proposed approach for obtaining better performance and show a low sensitivity to the selection of design parameters.
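The practical point about GPs is that, once the targets are real-valued soft labels, binary classification reduces to ordinary GP regression, so no Laplace or expectation-propagation approximation of a non-Gaussian likelihood is required. The following is a minimal sketch of this idea using scikit-learn's GaussianProcessRegressor; the emphasis function (a pilot fit whose output shrinks the target magnitude of easy, far-from-boundary samples, with an assumed strength `beta`) is a hypothetical illustration, not the paper's exact formulation.

```python
# Sketch: binary classification via GP *regression* on emphasized soft targets,
# avoiding the non-Gaussian likelihood (and hence Laplace/EP approximations)
# of standard GP classification. The emphasis rule below is an assumption
# chosen for illustration, not the paper's definition.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
t_hard = 2.0 * y_tr - 1.0                      # hard targets in {-1, +1}

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)

# Pilot regressor on hard targets; its output f(x) indicates how "easy"
# a training sample is (large f(x) * t means far from the boundary).
pilot = GaussianProcessRegressor(kernel=kernel).fit(X_tr, t_hard)
f_tr = pilot.predict(X_tr)

# Hypothetical emphasis: keep each label's sign but shrink the target
# magnitude of easy samples, so the final fit concentrates on samples
# near the decision boundary (and on errors, whose targets stay at +/-1).
beta = 1.0                                     # emphasis strength (assumed)
t_soft = t_hard * np.exp(-beta * np.maximum(f_tr * t_hard, 0.0))

# Final GP regressor on soft targets; the sign of its output classifies.
gp = GaussianProcessRegressor(kernel=kernel).fit(X_tr, t_soft)
y_pred = (gp.predict(X_te) > 0.0).astype(int)
print("test accuracy:", (y_pred == y_te).mean())
```

The design choice worth noting is that only the targets change between the pilot and final fits; the GP machinery itself is the standard regression model with a Gaussian likelihood, which is what makes the exact (approximation-free) posterior available.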