We propose a novel non-parametric adaptive outlier detection algorithm, called LPE, for high-dimensional data, based on score functions derived from nearest neighbor graphs on n-point nominal data. A test sample is declared an outlier whenever its score falls below α, the desired false alarm level. The resulting outlier detector is shown to be asymptotically optimal: it is uniformly most powerful at the specified false alarm level, α, when the density associated with the outliers is a mixture of the nominal density and a known density. Our algorithm is computationally efficient, being linear in dimension and quadratic in data size. The entire empirical Receiver Operating Characteristic (ROC) curve can be derived at almost no additional cost from the estimated score function. The method does not require choosing complicated tuning parameters or function approximation classes, and it can adapt to local structure, such as local changes in dimensionality, by incorporating manifold learning techniques. We demonstrate the algorithm on both artificial and real data sets in high-dimensional feature spaces.
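To make the idea concrete, here is a minimal sketch of a nearest-neighbor-based score of this kind. It is an illustrative reconstruction, not the authors' exact LPE implementation: it assumes the score of a test point is its empirical p-value, i.e. the fraction of nominal points whose k-nearest-neighbor radius is at least as large as the test point's, and flags the point as an outlier when that score drops below α. The function names (`lpe_scores`) and the choice `k=3` are hypothetical. Note the pairwise-distance computation is quadratic in the number of nominal samples and linear in dimension, matching the stated complexity.

```python
import numpy as np


def lpe_scores(nominal, test, k=3):
    """Empirical p-value scores for `test` points relative to `nominal` data.

    A small score means the test point lies in a sparser region than most
    nominal points, suggesting an outlier. (Illustrative sketch only.)
    """
    # Leave-one-out k-NN radii for the nominal sample: in the sorted
    # distance matrix, column 0 is the self-distance (0), so column k
    # is the distance to the k-th true neighbor.
    dn = np.linalg.norm(nominal[:, None, :] - nominal[None, :, :], axis=-1)
    dn.sort(axis=1)
    r_nominal = dn[:, k]

    # k-NN radius of each test point within the nominal sample.
    dt = np.linalg.norm(test[:, None, :] - nominal[None, :, :], axis=-1)
    dt.sort(axis=1)
    r_test = dt[:, k - 1]

    # Score = fraction of nominal radii that are >= the test radius.
    return (r_nominal[None, :] >= r_test[:, None]).mean(axis=1)


# Usage sketch: dense points score high, far-away points score near zero.
rng = np.random.default_rng(0)
nominal = rng.normal(size=(200, 2))          # nominal sample ~ N(0, I)
test = np.array([[0.0, 0.0], [8.0, 8.0]])    # one inlier, one clear outlier
scores = lpe_scores(nominal, test, k=3)
alpha = 0.05
flags = scores < alpha                        # True marks a declared outlier
```

Because the full vector of scores is computed in one pass, sweeping α over [0, 1] yields the whole empirical ROC curve from these same scores, with no extra distance computations, as the abstract notes.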