Machine Learning
Neural networks for pattern recognition
Self-Organizing Maps
Ensemble Methods in Machine Learning
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Combining Pattern Classifiers: Methods and Algorithms
Estimating the Support of a High-Dimensional Distribution
Neural Computation
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
On the History of the Minimum Spanning Tree Problem
IEEE Annals of the History of Computing
Minimum spanning tree based one-class classifier
Neurocomputing
Ensembles of One Class Support Vector Machines
MCS '09 Proceedings of the 8th International Workshop on Multiple Classifier Systems
Bagging classifiers for fighting poisoning attacks in adversarial classification tasks
MCS'11 Proceedings of the 10th international conference on Multiple classifier systems
Selective ensemble of support vector data descriptions for novelty detection
ISNN'12 Proceedings of the 9th international conference on Advances in Neural Networks - Volume Part I
Adversarial attacks against intrusion detection systems: Taxonomy, solutions and open issues
Information Sciences: an International Journal
Combining one-class classifiers via meta learning
Proceedings of the 22nd ACM international conference on Conference on information & knowledge management
Most conventional learning algorithms require both positive and negative training data to achieve accurate classification. In many applications, however, classifiers must be learned from positive data alone because negative examples are too costly or difficult to obtain, or are simply unavailable. The Minimum Spanning Tree Class Descriptor (MST_CD) has been shown to achieve higher accuracy than other one-class classifiers on high-dimensional data, but outliers in the target class severely degrade its performance. In this paper we propose two bagging strategies for MST_CD that reduce the influence of outliers in the training data, and we demonstrate the improved performance on both real and artificially contaminated data.
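A minimal sketch of the underlying idea, not the authors' implementation: each bag draws a subsample of the target-class training data, fits a minimum spanning tree over it, and scores a test point by its Euclidean distance to the nearest MST edge; averaging these distances across bags dampens the influence of any single outlier on the class description. The helper names (`fit_mst_edges`, `bagged_mst_cd`) and the choice of subsampling without replacement are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def fit_mst_edges(X):
    # Build an MST over the pairwise Euclidean distance matrix;
    # return the tree as a list of (i, j) vertex-index pairs.
    mst = minimum_spanning_tree(cdist(X, X)).tocoo()
    return list(zip(mst.row, mst.col))

def point_to_segment(p, a, b):
    # Euclidean distance from point p to the line segment a-b.
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def mst_cd_distance(X, edges, p):
    # MST_CD score: distance from p to the closest MST edge.
    return min(point_to_segment(p, X[i], X[j]) for i, j in edges)

def bagged_mst_cd(X_train, X_test, n_bags=10, frac=0.8, rng=None):
    # Average the MST_CD distance over n_bags subsampled trees
    # (subsampling without replacement keeps training points distinct,
    # which avoids zero-length MST edges).
    rng = np.random.default_rng(rng)
    n = len(X_train)
    scores = np.zeros(len(X_test))
    for _ in range(n_bags):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        Xb = X_train[idx]
        edges = fit_mst_edges(Xb)
        scores += [mst_cd_distance(Xb, edges, p) for p in X_test]
    return scores / n_bags  # threshold this score to accept/reject
```

In use, a point near the target class receives a low averaged distance and a distant point a high one, so a single threshold on the bagged score yields the one-class decision.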