In this paper, each one-class problem is regarded as the task of estimating a function that is positive on a desired slab and negative on its complement. The main advantage of this viewpoint is that the loss function and the expected risk can be defined so that the slab contains as many samples as possible. Inspired by SVMs, an intuitive notion of margin is also defined. As a result, a new linear optimization problem that maximizes this margin, together with several theoretically motivated learning algorithms, is obtained. Moreover, the proposed algorithms can be combined with boosting techniques to solve nonlinear one-class classification problems.
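The idea of learning a function that is positive on most of the training sample by solving a linear program can be sketched as follows. This is not the paper's exact formulation; it is a minimal illustration in the style of linear-programming one-class classifiers, assuming a linear score f(x) = w·x − ρ, slack variables for samples allowed to fall outside, an L1 normalization on w (split as w = u − v to keep the problem linear), and a ν-style trade-off parameter:

```python
import numpy as np
from scipy.optimize import linprog

def fit_oneclass_lp(X, nu=0.1):
    """Fit f(x) = w.x - rho so that f is nonnegative on most training
    samples, by maximizing rho (the margin-like offset) minus slack
    penalties in a linear program.  Illustrative sketch only."""
    n, d = X.shape
    # variable layout: [u (d) | v (d) | rho | xi (n)], with w = u - v
    c = np.concatenate([np.zeros(2 * d), [-1.0], np.full(n, 1.0 / (nu * n))])
    # per-sample constraints: rho - (u - v).x_i - xi_i <= 0
    A = np.hstack([-X, X, np.ones((n, 1)), -np.eye(n)])
    b = np.zeros(n)
    # L1 normalization keeping the problem bounded: sum(u) + sum(v) <= 1
    norm_row = np.concatenate([np.ones(2 * d), [0.0], np.zeros(n)])
    A = np.vstack([A, norm_row])
    b = np.append(b, 1.0)
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    u, v = res.x[:d], res.x[d:2 * d]
    return u - v, res.x[2 * d]

# toy data: a tight cluster, plus one far-away point to score
rng = np.random.default_rng(0)
X = rng.normal(loc=2.0, scale=0.5, size=(60, 2))
w, rho = fit_oneclass_lp(X)
scores = X @ w - rho                         # nonnegative on most of the slab
outlier_score = np.array([-5.0, -5.0]) @ w - rho   # negative: outside the slab
```

In this sketch the fraction of training points with negative score is roughly controlled by `nu`, mirroring the paper's aim of making the slab contain as many samples as possible; replacing the raw features with weak-learner outputs would give a boosting-style nonlinear extension.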