This paper proposes a 1-norm support vector novelty detection (SVND) method and analyzes its sparseness. 1-norm SVND is formulated as a linear programming problem and employs two sparseness-inducing techniques: 1-norm regularization and the hinge loss function. We also derive two upper bounds on the sparseness of 1-norm SVND: the exact support vector (ESV) bound and the kernel Gram matrix rank bound. The ESV bound shows that 1-norm SVND yields a sparser representation model than standard SVND, while the kernel Gram matrix rank bound gives a loose estimate of the sparseness of 1-norm SVND. Experimental results show that 1-norm SVND is feasible and effective.
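To make the linear-programming formulation concrete, here is a minimal sketch of a 1-norm kernel novelty detector solved with `scipy.optimize.linprog`. The LP below is an assumed formulation in the spirit described by the abstract (1-norm penalty on the expansion coefficients plus hinge-style slacks against a unit margin); the paper's exact program, and the function names `one_norm_svnd_fit` and `svnd_score`, are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import cdist


def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    return np.exp(-gamma * cdist(X, Y, "sqeuclidean"))


def one_norm_svnd_fit(X, C=10.0, gamma=0.5):
    """Fit a 1-norm kernel novelty detector by linear programming.

    Hypothetical LP (the paper's exact formulation may differ):
        min_{alpha, xi}  sum(alpha) + C * sum(xi)
        s.t.  K @ alpha >= 1 - xi,   alpha >= 0,   xi >= 0
    The 1-norm objective on alpha is what induces a sparse expansion.
    """
    m = len(X)
    K = rbf_kernel(X, X, gamma)
    # Variable vector: [alpha_1..alpha_m, xi_1..xi_m].
    c = np.concatenate([np.ones(m), C * np.ones(m)])
    # Constraint K @ alpha + xi >= 1 rewritten as -K @ alpha - xi <= -1.
    A_ub = np.hstack([-K, -np.eye(m)])
    b_ub = -np.ones(m)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * m), method="highs")
    alpha = res.x[:m]
    return alpha, K, res


def svnd_score(alpha, X_train, X_new, gamma=0.5):
    """Decision value sum_i alpha_i K(x, x_i); small values suggest novelty."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha


rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(40, 2))        # "normal" training data
alpha, K, res = one_norm_svnd_fit(X)
train_scores = svnd_score(alpha, X, X)
novel_score = svnd_score(alpha, X, np.array([[8.0, 8.0]]))[0]
# Sparseness: number of nonzero expansion coefficients (the quantity the
# paper bounds via ESVs and the rank of the kernel Gram matrix K).
sparsity = np.count_nonzero(alpha > 1e-8)
```

Because an LP solver returns a basic feasible solution, only a subset of the variables is nonzero, which is one intuition for why the 1-norm formulation produces a sparse kernel expansion; the point far from the data receives a near-zero score and is flagged as novel.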