The Neyman-Pearson (NP) paradigm in binary classification treats type I and type II errors with different priorities: it seeks classifiers that minimize the type II error subject to a type I error constraint at a user-specified level α. In this paper, plug-in classifiers are developed under the NP paradigm. Motivated by the fundamental Neyman-Pearson Lemma, we propose two related plug-in classifiers, obtained by thresholding, respectively, the class-conditional density ratio and the regression function; these two classifiers handle different sampling schemes. This work focuses on theoretical properties of the proposed classifiers; in particular, we derive oracle inequalities that can be viewed as finite-sample versions of risk bounds. NP classification can be used to address anomaly detection problems, where asymmetry between the two errors is an intrinsic property. In contrast to a common practice in anomaly detection that thresholds the normal-class density alone, our approach does not assume a specific form for the anomaly distribution. This is particularly important when the anomaly-class density is far from uniform.
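The abstract's first plug-in classifier can be illustrated concretely. The sketch below is not the paper's estimator (the paper's construction and rates are not given here); it is a minimal Python illustration, assuming Gaussian kernel density estimates for the two class-conditional densities and calibrating the threshold on a held-out normal-class sample so that the empirical type I error is at most α:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
alpha = 0.10  # type I error budget

# Synthetic data (assumption for illustration):
# class 0 ("normal") ~ N(0, 1), class 1 ("anomaly") ~ N(3, 1)
x0_train = rng.normal(0.0, 1.0, 500)  # class-0 sample for density estimation
x0_cal = rng.normal(0.0, 1.0, 500)    # held-out class-0 sample for calibration
x1_train = rng.normal(3.0, 1.0, 500)  # class-1 sample for density estimation

# Plug-in estimates of the class-conditional densities
f0 = gaussian_kde(x0_train)
f1 = gaussian_kde(x1_train)

def ratio(x):
    """Estimated class-conditional density ratio f1(x) / f0(x)."""
    return f1(x) / f0(x)

# Threshold the ratio at the (1 - alpha)-quantile of its values on the
# held-out normal sample, so the empirical type I error is about alpha.
t = np.quantile(ratio(x0_cal), 1.0 - alpha)

def classify(x):
    """1 = flag as anomaly, 0 = normal."""
    return (ratio(x) > t).astype(int)

# Evaluate on fresh samples from each class
x0_test = rng.normal(0.0, 1.0, 1000)
x1_test = rng.normal(3.0, 1.0, 1000)
type1 = classify(x0_test).mean()   # empirical type I error, near alpha
detect = classify(x1_test).mean()  # empirical detection rate (1 - type II)
print(f"type I error: {type1:.3f}, detection rate: {detect:.3f}")
```

Note how the anomaly distribution enters only through the estimated ratio: no parametric form is assumed for it, which is the point the abstract makes against thresholding the normal-class density alone.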