Single-Class Classification (SCC) seeks to distinguish one class of data from the universal set of multiple classes. We call the target class positive and the complementary set of samples negative. In SCC problems, it is assumed that a reasonable sample of the negative data is not available. SCC problems are prevalent in the real world, where positive and unlabeled data are widely available but negative data are hard or expensive to acquire. We present an SCC algorithm called Mapping Convergence (MC) that computes an accurate boundary of the target class from positive and unlabeled data (without labeled negative data). The basic idea of MC is to exploit the natural "gap" between positive and negative data by incrementally labeling negative data from the unlabeled set using the margin-maximization property of SVM. We also present Support Vector Mapping Convergence (SVMC), which optimizes the MC algorithm for fast training. Our analyses show that MC and SVMC, without labeled negative data, significantly outperform other SCC methods. They generate boundaries as accurate as those of a standard SVM trained on fully labeled data when the positive data are not severely under-sampled and gaps exist between the positive and negative classes in the feature space. Our results also show that SVMC trains much faster than MC with very close accuracy.
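The incremental-labeling idea behind MC can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes scikit-learn as the toolkit, uses a one-class SVM as the initial weak classifier that flags "strong negatives" in the unlabeled set, and then repeatedly retrains a standard SVM, moving unlabeled points that fall on the negative side of the margin into the negative set until the boundary stops moving. The function name `mapping_convergence` and all parameter choices (`nu`, kernel, iteration cap) are illustrative assumptions.

```python
# Hedged sketch of the Mapping Convergence (MC) loop described in the abstract.
# Assumes scikit-learn; this is an illustration, not the authors' code.
import numpy as np
from sklearn.svm import OneClassSVM, SVC

def mapping_convergence(P, U, max_iter=10):
    """P: positive samples, U: unlabeled samples (both 2-D arrays).
    Returns an SVM trained on P vs. incrementally labeled negatives."""
    # Step 1: a weak one-class classifier on P flags "strong negatives" in U,
    # i.e. points far outside the support of the positive distribution.
    occ = OneClassSVM(nu=0.1, gamma="scale").fit(P)
    strong_neg = occ.predict(U) == -1
    N = U[strong_neg]            # initial negative set
    remaining = U[~strong_neg]   # still unlabeled
    clf = None
    # Step 2: iterate — retrain SVM on P vs. N, then move unlabeled points
    # classified negative into N; the max-margin boundary "converges" into
    # the natural gap between the classes.
    for _ in range(max_iter):
        if len(N) == 0 or len(remaining) == 0:
            break
        X = np.vstack([P, N])
        y = np.hstack([np.ones(len(P)), -np.ones(len(N))])
        clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
        pred = clf.predict(remaining)
        newly_negative = remaining[pred == -1]
        if len(newly_negative) == 0:
            break  # boundary stopped moving: converged
        N = np.vstack([N, newly_negative])
        remaining = remaining[pred != -1]
    return clf
```

SVMC, as described, speeds up this loop; one plausible reading is that each retraining step keeps only the support vectors of the previous model rather than the full accumulated negative set, which shrinks the training set per iteration at little cost in accuracy.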