The One-Class Support Vector Machine typically employs a grid search to select the best parameters for a given data set. Under this scheme, it is assumed that two separate trade-off parameters suffice: one assigned to normal samples and one to abnormal samples. However, this assumption does not always hold, because individual samples contribute differently to the construction of the hypersphere or hyperplane decision boundary. In this paper, we introduce a new iterative learning process, carried out immediately after grid parameter selection, that refines the trade-off parameter value for each sample. In this process, each sample is assigned a weight representing its contribution, and the weights are refined iteratively. Experimental results on a number of data sets show improved performance for the proposed approach.
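To make the idea concrete, the following is a minimal numpy sketch of iterative per-sample weight refinement, assuming a simplified SVDD-style spherical boundary (weighted centroid instead of a kernelized hypersphere). The function name, the exponential down-weighting rule, and the parameter `beta` are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def refine_sample_weights(X, n_iters=10, beta=1.0):
    """Iteratively refine per-sample weights around a hypersphere center.

    Illustrative sketch only: each sample's weight reflects its estimated
    contribution to the decision boundary (samples near the current center
    get higher weight, suspected outliers are down-weighted).
    """
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                # start from uniform trade-off weights
    center = X.mean(axis=0)
    for _ in range(n_iters):
        center = (w[:, None] * X).sum(axis=0) / w.sum()  # weighted centroid
        d = np.linalg.norm(X - center, axis=1)           # distance to center
        w = np.exp(-beta * d)                            # down-weight far samples
        w /= w.sum()                                     # keep weights normalized
    return w, center

# toy data: a tight cluster plus one obvious outlier (index 20)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)), [[5.0, 5.0]]])
w, center = refine_sample_weights(X)
```

In practice such a refinement step would follow the grid search, with the resulting per-sample weights scaling each sample's trade-off term in the one-class SVM objective (e.g. via a per-sample weight argument to the solver).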