In many classification problems, it is desirable to obtain estimates of the conditional class probabilities rather than just "hard" class predictions. Many algorithms designed specifically for this purpose exist; here, we present a way to apply hard classification algorithms to this problem without modification. The main idea is that by stochastically changing the class labels in the training data in a simple way, a classification algorithm can be used to estimate any contour of the conditional class probability function. The method has been tested on a toy problem and on a problem with real-world data; both experiments yielded encouraging results.
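The idea sketched in the abstract can be illustrated with a small, self-contained example. The sketch below is an assumption about how such a scheme might look, not the authors' actual construction: if labels are flipped with probabilities chosen so that the Bayes-optimal decision boundary of the relabeled problem moves from P(y=1|x) = 0.5 to P(y=1|x) = t, then an ordinary hard classifier (here a 1-D decision stump, chosen for simplicity) trained on the relabeled data approximately recovers the t-contour. All names (`relabel`, `stump_threshold`) and the particular flip probabilities are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D problem with a known probability function:
# P(y = 1 | x) = x for x uniform on [0, 1], so the t-contour sits at x = t.
n = 20000
x = rng.uniform(0, 1, n)
y = (rng.uniform(0, 1, n) < x).astype(int)

def relabel(y, t, rng):
    """Stochastically flip labels so the Bayes-optimal hard classifier's
    boundary moves from p(x) = 0.5 to p(x) = t (illustrative choice).

    Flipping positives to negatives with prob q gives a boundary at
    0.5 / (1 - q); flipping negatives to positives with prob r gives
    (0.5 - r) / (1 - r). Solving for the target t yields q and r below.
    """
    y2 = y.copy()
    if t > 0.5:
        q = 1 - 0.5 / t                 # flip 1 -> 0 with probability q
        flip = (y2 == 1) & (rng.uniform(0, 1, len(y2)) < q)
    else:
        r = (0.5 - t) / (1 - t)         # flip 0 -> 1 with probability r
        flip = (y2 == 0) & (rng.uniform(0, 1, len(y2)) < r)
    y2[flip] = 1 - y2[flip]
    return y2

def stump_threshold(x, y):
    """Hard classifier: best 1-D decision stump (predict 1 for x > s)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    # Errors for a split after position i: positives to the left of the
    # split plus negatives to the right of it.
    pos_left = np.cumsum(ys)
    neg_right = np.cumsum((1 - ys)[::-1])[::-1]
    err = pos_left[:-1] + neg_right[1:]
    i = np.argmin(err)
    return 0.5 * (xs[i] + xs[i + 1])

for t in (0.25, 0.5, 0.75):
    s = stump_threshold(x, relabel(y, t, rng))
    print(f"target contour p(x) = {t}: estimated boundary x = {s:.3f}")
```

With enough data the learned split lands close to x = t for each target contour, so sweeping t traces out the whole conditional probability function using only a hard classifier.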