Model selection in unsupervised learning is a hard problem. In this paper a simple selection criterion for hyper-parameters in one-class classifiers (OCCs) is proposed, which exploits the particular structure of the one-class problem. The main idea is to increase the complexity of the classifier until it becomes inconsistent on the target class; this identifies the most complex classifier that can still be reliably trained on the available data. Experiments indicate the usefulness of the approach.
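The selection procedure described above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's OneClassSVM as the one-class classifier, uses the RBF kernel width gamma as the complexity parameter, and treats the classifier as inconsistent once its rejection rate on the target training set exceeds the user-set error level by a chosen slack. The grid of gamma values, the slack, and the synthetic target data are all illustrative assumptions.

```python
# Hedged sketch of consistency-based hyper-parameter selection for a
# one-class classifier (OCC). Assumptions (not from the paper): the OCC is
# scikit-learn's OneClassSVM, complexity is swept via gamma, and the
# slack tolerance and data are illustrative.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))  # target-class training data only

nu = 0.1      # user-set fraction of targets allowed to be rejected
slack = 0.05  # tolerance before the classifier is called inconsistent
gammas = [0.01, 0.1, 1.0, 10.0, 100.0]  # complexity, low -> high

selected = None
for gamma in gammas:  # increase complexity step by step
    occ = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X)
    # Rejection rate of the target class by the trained classifier:
    reject_rate = np.mean(occ.predict(X) == -1)
    if reject_rate > nu + slack:
        # Classifier rejects far more targets than requested:
        # it has become inconsistent, so stop increasing complexity.
        break
    selected = gamma  # most complex setting that is still consistent

print("selected gamma:", selected)
```

The loop keeps the last hyper-parameter value for which the classifier still behaves consistently on the target class, i.e. the most complex model that can still be trained reliably on the available target data.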